arXiv Analytics


arXiv:2310.06918 [cs.CL]

Improving Contrastive Learning of Sentence Embeddings with Focal-InfoNCE

Pengyue Hou, Xingyu Li

Published 2023-10-10 (Version 1)

The recent success of SimCSE has greatly advanced state-of-the-art sentence representations. However, the original formulation of SimCSE does not fully exploit the potential of hard negative samples in contrastive learning. This study introduces an unsupervised contrastive learning framework that combines SimCSE with hard negative mining, aiming to enhance the quality of sentence embeddings. The proposed Focal-InfoNCE function introduces self-paced modulation terms into the contrastive objective, down-weighting the loss associated with easy negatives and encouraging the model to focus on hard negatives. Experiments on various STS benchmarks show that our method improves sentence embeddings in terms of Spearman's correlation and representation alignment and uniformity.
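The abstract does not give the exact form of the Focal-InfoNCE objective, but the idea of adding a focal-style, self-paced modulation term to the in-batch InfoNCE loss can be sketched as follows. This is a hypothetical PyTorch illustration, not the paper's formulation: the function name `focal_infonce_loss`, the choice of modulating each example's loss by `(1 - p_pos)^gamma`, and the default hyperparameters are all assumptions for the sake of the sketch.

```python
import torch
import torch.nn.functional as F

def focal_infonce_loss(z1, z2, temperature=0.05, gamma=2.0):
    """Hypothetical sketch of a focal-modulated InfoNCE loss.

    z1, z2: (batch, dim) embeddings of two views of the same sentences
    (e.g. SimCSE's dropout-based positive pairs). The other in-batch
    examples act as negatives. A focal-style term (1 - p_pos)^gamma
    down-weights "easy" examples, where the positive already dominates
    the softmax, so training focuses on hard negatives.
    """
    z1 = F.normalize(z1, dim=-1)
    z2 = F.normalize(z2, dim=-1)
    sim = z1 @ z2.t() / temperature                # (batch, batch) scaled cosine sims
    labels = torch.arange(sim.size(0), device=sim.device)

    # Probability assigned to the positive pair (diagonal of the softmax).
    pos_prob = sim.softmax(dim=-1)[labels, labels]
    # Focal modulation: near-1 positive probability => near-0 weight.
    modulation = (1.0 - pos_prob) ** gamma

    per_example = F.cross_entropy(sim, labels, reduction="none")
    return (modulation * per_example).mean()
```

With `gamma=0` the modulation term is identically 1 and the sketch reduces to the standard in-batch InfoNCE loss used by SimCSE, which is a useful sanity check when experimenting with the focal weighting.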

Related articles:
arXiv:2305.13192 [cs.CL] (Published 2023-05-22)
ImSimCSE: Improving Contrastive Learning for Sentence Embeddings from Two Perspectives
arXiv:2305.01918 [cs.CL] (Published 2023-05-03)
Improving Contrastive Learning of Sentence Embeddings from AI Feedback
arXiv:1605.04655 [cs.CL] (Published 2016-05-16)
Joint Learning of Sentence Embeddings for Relevance and Entailment