Introduction to Contrastive Loss

Contrastive loss is a widely used objective in metric learning and contrastive learning. Its goal is to learn an embedding space in which similar samples lie close together and dissimilar samples lie far apart.

The loss operates on pairs of samples:

- Positive pairs: two samples that should be considered similar.
- Negative pairs: two samples that should be considered different.

Given a pair of embeddings and a binary label, contrastive loss:

- penalizes large distances between positive pairs, and
- penalizes small distances between negative pairs (up to a margin).

This encourages the model to learn representations that are both discriminative and geometry-aware.
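To make this concrete, one standard formulation (following Hadsell et al., 2006; some variants add a factor of 1/2) is:

$$
\mathcal{L}(x_1, x_2, y) = y \, d^2 + (1 - y) \, \max(0,\, m - d)^2,
\qquad d = \lVert f(x_1) - f(x_2) \rVert_2,
$$

where $y = 1$ for positive pairs, $y = 0$ for negative pairs, $f$ is the embedding network, and $m$ is the margin. The first term pulls positive pairs together; the second term pushes negative pairs apart until they are at least $m$ away, after which they contribute no loss.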
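Here is a minimal PyTorch sketch of this pairwise formulation. The function name, default margin, and tensor shapes are illustrative choices, not from the original post:

```python
import torch
import torch.nn.functional as F

def contrastive_loss(z1, z2, y, margin=1.0):
    """Pairwise contrastive loss.

    z1, z2: (batch, dim) embeddings for each side of the pair.
    y:      (batch,) float labels, 1 = positive pair, 0 = negative pair.
    margin: minimum desired distance between negative pairs.
    """
    d = F.pairwise_distance(z1, z2)            # Euclidean distance per pair
    pos = y * d.pow(2)                         # pull positive pairs together
    neg = (1 - y) * F.relu(margin - d).pow(2)  # push negatives apart, up to the margin
    return (pos + neg).mean()

# Example usage with random embeddings and labels:
z1 = torch.randn(8, 128)
z2 = torch.randn(8, 128)
y = torch.randint(0, 2, (8,)).float()
loss = contrastive_loss(z1, z2, y)
```

Note the asymmetry by design: positive pairs are penalized at any nonzero distance, while negative pairs are penalized only inside the margin, so already-well-separated negatives do not dominate the gradient.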

January 9, 2026 · 1 min