Hi everyone, I'm 对白.
Contrastive learning has become extremely popular lately and is now a hot topic at every major conference. The reason is simple: it tackles the classic problem of limited labeled data in supervised training (a problem that is very common in industry). Its rise has therefore been great news for CV, NLP, and recommendation. Specifically:
1. In CV, it addresses the problem of how, without an even larger labeled dataset, to use self-supervised pretraining to absorb the prior knowledge distribution contained in the images themselves and obtain a pretrained model;
2. In NLP, it confirms the observation that the more data used for self-supervised pretraining and the more complex the model, the more knowledge the model can absorb and the better it performs on downstream tasks;
3. In recommendation, it tackles four issues: data sparsity, the long-tail distribution of items, aggregating multiple different views in cross-domain recommendation, and making models more robust to noise. If you are interested, see my earlier article: 推荐系统中不得不学的对比学习(Contrastive Learning)方法 (contrastive learning methods for recommender systems).
To keep up with the frontier and latest progress in contrastive learning, I have compiled the contrastive-learning papers from the major conferences over the past year, covering eleven venues:
ICLR2021, SIGIR2021, WWW2021, CVPR2021, AAAI2021, NAACL2021, ICLR2020, NIPS2020, CVPR2020, ICML2020, and KDD2020, more than 60 papers in total. The collection focuses on long papers and research papers, with a small number of short papers and industry papers. Given the amount of work involved, omissions are inevitable; corrections in the comments are welcome.
The paper list in this post has also been synced to GitHub, where I will keep it updated with new conference papers. Feel free to follow and star~
https://github.com/coder-duibai/Contrastive-Learning-Papers-Codes
Nine Categories
I have grouped the 60+ papers, together with their code, into nine categories.
1. Computer Vision
The first category is computer vision, which is also the richest section, covering 19 papers and their code.
It includes some very well-known recent models: Kaiming He's MoCo and MoCo v2, as well as SimCLR and SimCLR v2 from Geoffrey Hinton's group, all fall into this category. A minimal sketch of the contrastive objective these methods share is given below, before the paper list.
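To make the common thread behind these papers concrete, here is a minimal, illustrative sketch of the NT-Xent (InfoNCE) objective that SimCLR-style methods optimize: two augmented views of the same image form a positive pair, and every other sample in the batch serves as a negative. This is only a rough PyTorch sketch, not the official implementation of any paper listed below; the function name `info_nce_loss`, the temperature value, and the toy tensors are my own illustrative choices.

```python
import torch
import torch.nn.functional as F

def info_nce_loss(z1, z2, temperature=0.5):
    """NT-Xent / InfoNCE loss over a batch of paired augmented views.

    z1, z2: (N, D) embeddings of two augmentations of the same N images.
    The positive for z1[i] is z2[i]; every other embedding in the
    concatenated 2N-sample batch acts as a negative.
    """
    n = z1.size(0)
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)   # (2N, D), unit norm
    sim = z @ z.t() / temperature                        # cosine similarities
    sim.fill_diagonal_(float("-inf"))                    # never contrast a sample with itself
    # Row i (< N) should pick column i + N as its positive, and vice versa.
    targets = torch.cat([torch.arange(n) + n, torch.arange(n)]).to(z.device)
    return F.cross_entropy(sim, targets)

# Toy usage: in practice z1, z2 come from an encoder + projection head
# applied to two random augmentations of the same image batch.
z1, z2 = torch.randn(8, 128), torch.randn(8, 128)
print(info_nce_loss(z1, z2))
```

MoCo replaces the in-batch negatives with a queue filled by a momentum encoder, while BYOL and SimSiam drop explicit negatives altogether, but the core idea of pulling augmented views of the same image together is shared across the papers below.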
1. [PCL] Prototypical Contrastive Learning of Unsupervised Representations. ICLR2021.
Authors:Junnan Li, Pan Zhou, Caiming Xiong, Steven C.H. Hoi. paper code
2. [BalFeat] Exploring Balanced Feature Spaces for Representation Learning. ICLR2021. Authors:Bingyi Kang, Yu Li, Sa Xie, Zehuan Yuan, Jiashi Feng. paper
3. [MiCE] MiCE: Mixture of Contrastive Experts for Unsupervised Image Clustering. ICLR2021. Authors:Tsung Wei Tsai, Chongxuan Li, Jun Zhu. paper code
4. [i-Mix] i-Mix: A Strategy for Regularizing Contrastive Representation Learning. ICLR2021.
Authors:Kibok Lee, Yian Zhu, Kihyuk Sohn, Chun-Liang Li, Jinwoo Shin, Honglak Lee. paper code
5. Contrastive Learning with Hard Negative Samples. ICLR2021.
Authors:Joshua Robinson, Ching-Yao Chuang, Suvrit Sra, Stefanie Jegelka. paper code
6. [LooC] What Should Not Be Contrastive in Contrastive Learning. ICLR2021.
Authors:Tete Xiao, Xiaolong Wang, Alexei A. Efros, Trevor Darrell. paper
7. [MoCo] Momentum Contrast for Unsupervised Visual Representation Learning. CVPR2020.
Authors:Kaiming He, Haoqi Fan, Yuxin Wu, Saining Xie, Ross Girshick. paper code
8. [MoCo v2] Improved Baselines with Momentum Contrastive Learning.
Authors:Xinlei Chen, Haoqi Fan, Ross Girshick, Kaiming He. paper code
9. [SimCLR] A Simple Framework for Contrastive Learning of Visual Representations. ICML2020. Authors:Ting Chen, Simon Kornblith, Mohammad Norouzi, Geoffrey Hinton. paper code
10. [SimCLR v2] Big Self-Supervised Models are Strong Semi-Supervised Learners. NIPS2020.
Authors:Ting Chen, Simon Kornblith, Kevin Swersky, Mohammad Norouzi, Geoffrey Hinton. paper code
11. [BYOL] Bootstrap your own latent: A new approach to self-supervised Learning. NIPS2020.
Authors:Jean-Bastien Grill, Florian Strub, Florent Altché, Corentin Tallec, Pierre H., et al.
12. [SwAV] Unsupervised Learning of Visual Features by Contrasting Cluster Assignments. NIPS2020. Authors:Mathilde Caron, Ishan Misra, Julien Mairal, Priya Goyal, Piotr Bojanowski, Armand Joulin. paper code
13. [SimSiam] Exploring Simple Siamese Representation Learning. CVPR2021.
Authors:Xinlei Chen, Kaiming He. paper code
14. Hard Negative Mixing for Contrastive Learning. NIPS2020.
Authors:Yannis Kalantidis, Mert Bulent Sariyildiz, Noe Pion, Philippe Weinzaepfel, Diane Larlus. paper
15. Supervised Contrastive Learning. NIPS2020. Authors:Prannay Khosla, Piotr Teterwak, Chen Wang, Aaron Sarna, Yonglong Tian, Phillip Isola, Aaron Maschinot, Ce Liu, Dilip Krishnan. paper
16. [LoCo] LoCo: Local Contrastive Representation Learning. NIPS2020.
Authors:Yuwen Xiong, Mengye Ren, Raquel Urtasun. paper
17. What Makes for Good Views for Contrastive Learning? NIPS2020.
Authors:Yonglong Tian, Chen Sun, Ben Poole, Dilip Krishnan, Cordelia Schmid, Phillip Isola. paper
18. [ContraGAN] ContraGAN: Contrastive Learning for Conditional Image Generation. NIPS2020.
Authors:Minguk Kang, Jaesik Park. paper code
19. [SpCL] Self-paced Contrastive Learning with Hybrid Memory for Domain Adaptive Object Re-ID. NIPS2020.
Authors:Yixiao Ge, Feng Zhu, Dapeng Chen, Rui Zhao, Hongsheng Li. paper code