arXiv:2204.11018 [cs.CV]

Exploring Negatives in Contrastive Learning for Unpaired Image-to-Image Translation

Yupei Lin, Sen Zhang, Tianshui Chen, Yongyi Lu, Guangping Li, Yukai Shi

Published 2022-04-23, Version 1

Unpaired image-to-image translation aims to find a mapping between the source domain and the target domain. To alleviate the lack of supervised labels for the source images, cycle-consistency based methods have been proposed to preserve image structure by assuming a reversible relationship between unpaired images. However, this assumption exploits only limited correspondence between image pairs. Recently, contrastive learning (CL) has been used to further investigate image correspondence in unpaired image translation through patch-based positive/negative learning. Patch-based contrastive routines obtain the positives by self-similarity computation and treat the remaining patches as negatives. This flexible learning paradigm obtains auxiliary contextualized information at a low cost. Since the negatives come in very large numbers, we investigate a natural question: are all negatives necessary for feature contrastive learning? Unlike previous CL approaches that use as many negatives as possible, in this paper we study the negatives from an information-theoretic perspective and introduce a new negative Pruning technology for Unpaired image-to-image Translation (PUT) that sparsifies and ranks the patches. The proposed algorithm is efficient, flexible, and enables the model to learn essential information between corresponding patches stably. By putting quality over quantity, only a few negative patches are required to achieve better results. Finally, we validate the superiority, stability, and versatility of our model through comparative experiments.
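
The abstract does not give the exact pruning procedure, but a minimal sketch of the idea, assuming a patch-wise InfoNCE-style loss in which negatives are ranked by similarity to the query patch and only the top-k are kept, might look as follows. The function name, the ranking criterion (hardest negatives first), and the value of k are illustrative assumptions, not the authors' implementation.

# Minimal sketch (not the authors' exact PUT algorithm): a patch-wise
# InfoNCE-style loss where only the k highest-scoring negatives are kept,
# illustrating "quality over quantity" negative pruning.
import torch
import torch.nn.functional as F

def pruned_patch_nce(query, positive, negatives, k=16, tau=0.07):
    """query: (B, D) anchor patch features from the translated image.
    positive: (B, D) corresponding patch features from the source image.
    negatives: (B, N, D) other source patches treated as negatives.
    Only the k most query-similar negatives per anchor are retained (assumed criterion)."""
    query = F.normalize(query, dim=-1)
    positive = F.normalize(positive, dim=-1)
    negatives = F.normalize(negatives, dim=-1)

    # Positive logit: similarity between corresponding patches.
    l_pos = (query * positive).sum(dim=-1, keepdim=True) / tau            # (B, 1)

    # Negative logits against all other patches, then prune by ranking.
    l_neg = torch.bmm(negatives, query.unsqueeze(-1)).squeeze(-1) / tau   # (B, N)
    k = min(k, l_neg.size(1))
    l_neg, _ = l_neg.topk(k, dim=-1)                                      # keep top-k only

    logits = torch.cat([l_pos, l_neg], dim=-1)                            # (B, 1+k)
    labels = torch.zeros(logits.size(0), dtype=torch.long, device=logits.device)
    return F.cross_entropy(logits, labels)

# Example usage with random features:
# loss = pruned_patch_nce(torch.randn(8, 256), torch.randn(8, 256),
#                         torch.randn(8, 255, 256), k=16)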

Comments: We find that negatives yield better effects in contrastive learning when a sample pruning constraint is adopted
Categories: cs.CV, cs.AI, cs.LG
Related articles:
arXiv:2106.09958 [cs.CV] (Published 2021-06-18)
Novelty Detection via Contrastive Learning with Negative Data Augmentation
arXiv:2005.10243 [cs.CV] (Published 2020-05-20)
What Makes for Good Views for Contrastive Learning?
arXiv:2201.12813 [cs.CV] (Published 2022-01-30)
Contrastive Learning from Demonstrations