Contrastive Learning in PyTorch
▬▬ Notes ▬▬▬▬▬▬▬▬▬▬▬
Two small things I realized when editing this video:
- SimCLR uses two separate augmented views as positive samples
- Many frameworks add a separate projection head on top of the learned representations, which transforms them further before the contrastive loss is applied
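The two notes above can be sketched in PyTorch: a small MLP projection head on top of the encoder features, and the NT-Xent loss that treats the two augmented views of each sample as the positive pair. This is a minimal sketch, not the exact code from the video; the dimensions and temperature are placeholder choices.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ProjectionHead(nn.Module):
    """Small MLP applied on top of encoder features, used only for the loss
    (dimensions here are illustrative, not from the video)."""
    def __init__(self, in_dim=128, hidden_dim=128, out_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden_dim),
            nn.ReLU(inplace=True),
            nn.Linear(hidden_dim, out_dim),
        )

    def forward(self, x):
        return self.net(x)

def nt_xent_loss(z1, z2, temperature=0.5):
    """NT-Xent over a batch: z1[i] and z2[i] are projections of the two
    augmented views of sample i (the positive pair)."""
    n = z1.size(0)
    # L2-normalize so the dot product is cosine similarity
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)  # (2n, d)
    sim = z @ z.t() / temperature                        # (2n, 2n) similarities
    sim.fill_diagonal_(float("-inf"))                    # exclude self-pairs
    # the positive for row i is row (i + n) % 2n; everything else is a negative
    targets = (torch.arange(2 * n, device=z.device) + n) % (2 * n)
    return F.cross_entropy(sim, targets)
```

In training, each batch is augmented twice, both views are passed through the encoder and the projection head, and `nt_xent_loss` is applied to the two projected batches; the projection head is discarded afterwards and the encoder output is used as the representation.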
▬▬ Papers/Sources ▬▬▬▬▬▬▬
- Intro: https://sthalles.github.io/a-few-words-on-representation-learning/
- Survey: https://arxiv.org/ftp/arxiv/papers/2010/2010.05113.pdf
- Supervised Contrastive Learning: https://arxiv.org/abs/2004.11362
- Contrastive Loss: https://medium.com/@maksym.bekuzarov/losses-explained-contrastive-loss-f8f57fe32246
- Triplet Loss: https://towardsdatascience.com/triplet-loss-advanced-intro-49a07b7d8905
- NT-Xent Loss: https://medium.datadriveninvestor.com/simclr-part-2-the-encoder-projection-head-and-loss-function-809a64f30d4a
- SimCLR: https://arxiv.org/abs/2002.05709