Conference paper, 2023

Integrating Prior Knowledge in Contrastive Learning with Kernel

Abstract

Data augmentation is a crucial component in unsupervised contrastive learning (CL). It determines how positive samples are defined and, ultimately, the quality of the learnt representation. In this work, we open the door to new perspectives for CL by integrating prior knowledge, given either by generative models (viewed as prior representations) or by weak attributes, into the positive and negative sampling. To this end, we use kernel theory to propose a novel loss, called decoupled uniformity, that i) allows the integration of prior knowledge and ii) removes the negative-positive coupling present in the original InfoNCE loss. We draw a connection between contrastive learning and conditional mean embedding theory to derive tight bounds on the downstream classification loss. In an unsupervised setting, we empirically demonstrate that CL benefits from generative models to improve its representation on both natural and medical images. In a weakly supervised scenario, our framework outperforms other unconditional and conditional CL approaches. Source code is available at this https URL.
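The abstract names the decoupled uniformity loss but does not spell it out. As a rough illustration only, below is a minimal PyTorch sketch of what a decoupled-uniformity-style objective could look like, assuming each sample's positive views are summarized by their centroid in embedding space and uniformity is enforced with a Gaussian kernel between centroids. The function name, arguments, and the temperature t are illustrative assumptions, not the authors' implementation (see the linked source code for that).

    import torch

    def decoupled_uniformity(z1: torch.Tensor, z2: torch.Tensor, t: float = 1.0) -> torch.Tensor:
        """Illustrative sketch of a decoupled-uniformity-style loss.

        z1, z2: embeddings of two augmented views of the same batch,
        shape (n, d), assumed L2-normalized. Positives are aligned
        implicitly through their centroid, so there is no
        negative-positive coupling term as in InfoNCE.
        """
        mu = 0.5 * (z1 + z2)                  # centroid of each sample's positive views
        sq_dist = torch.cdist(mu, mu).pow(2)  # pairwise squared distances between centroids
        n = mu.size(0)
        off_diag = ~torch.eye(n, dtype=torch.bool, device=mu.device)
        # Spread centroids of different samples apart (uniformity term only)
        return torch.log(torch.exp(-t * sq_dist[off_diag]).mean())

In this reading, prior knowledge would enter by replacing the plain Euclidean geometry between centroids with a kernel computed from prior representations (e.g., generative-model latents or weak attributes), as the abstract describes; that extension is omitted here.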
Main file
2206.01646.pdf (790.18 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-04111825, version 1 (31-05-2023)

Identifiers

  • HAL Id: hal-04111825, version 1

Cite

Benoit Dufumier, Carlo Alberto Barbano, Robin Louiset, Edouard Duchesnay, Pietro Gori. Integrating Prior Knowledge in Contrastive Learning with Kernel. 40th International Conference on Machine Learning, Jul 2023, Honolulu, United States. ⟨hal-04111825⟩