Conditional contrastive learning frameworks employ a conditional sampling procedure that constructs positive and negative data pairs conditioned on specific variables.
Although conditional contrastive learning enables many applications, the conditional sampling procedure becomes challenging when we cannot obtain sufficient data pairs for some values of the conditioning variable.
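To make the sampling procedure concrete, the following is a minimal sketch (not any specific paper's implementation) of conditional contrastive learning: negatives are drawn only from examples sharing the anchor's conditioning value, and an InfoNCE-style loss scores the anchor against its positive and those conditioned negatives. The function names and the 10%-temperature default are illustrative assumptions.

```python
import numpy as np

def conditional_info_nce(anchor, positive, negatives, temperature=0.1):
    """InfoNCE-style loss for one anchor, where `positive` and every row of
    `negatives` were sampled conditioned on the anchor's value of the
    conditioning variable (e.g. same class label, same speaker)."""
    def norm(v):
        return v / np.linalg.norm(v, axis=-1, keepdims=True)
    a, p, n = norm(anchor), norm(positive), norm(negatives)
    logits = np.concatenate([[a @ p], n @ a]) / temperature
    logits -= logits.max()                       # numerical stability
    probs = np.exp(logits) / np.exp(logits).sum()
    return -np.log(probs[0])                     # positive sits at index 0

def sample_conditioned(indices_by_value, value, anchor_idx, k, rng):
    """Draw k negatives that share the anchor's conditioning value.
    Fails when too few examples carry that value -- exactly the
    data-scarcity problem described in the text."""
    pool = [i for i in indices_by_value[value] if i != anchor_idx]
    if len(pool) < k:
        raise ValueError("not enough samples for this condition value")
    return rng.choice(pool, size=k, replace=False)
```

When a condition value has only a handful of examples, `sample_conditioned` has no valid pool to draw from, which is why insufficient pairs for some values make the procedure break down.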
In this paper we provide a comprehensive literature review and propose a general contrastive representation learning framework that simplifies and unifies many existing contrastive learning methods.
We also provide a taxonomy of the components of contrastive learning in order to summarise the field and distinguish it from other forms of machine learning.
We propose explanation guided augmentations (EGA) and the explanation guided contrastive learning for sequential recommendation (EC4SRec) model framework to address data sparsity caused by users with few item interactions and items with few user adoptions.
The key idea behind EGA is to utilize explanation method(s) to determine items' importance in a user sequence and to derive the positive and negative sequences accordingly.
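The idea of deriving views from item importance can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the explanation scores are assumed given, and the specific augmentation rules (drop the least important items for a positive view, replace the most important for a negative view) are one plausible instantiation of the described scheme.

```python
import numpy as np

def explanation_guided_augment(seq, importance, item_pool, rng, frac=0.3):
    """Derive one positive and one negative view of a user sequence from
    per-item explanation scores (higher = more important).

    Positive view: drop the LEAST important items, preserving the part of
    the sequence the explanation deems decisive.
    Negative view: replace the MOST important items with random items from
    `item_pool`, destroying the decisive signal."""
    k = max(1, int(frac * len(seq)))
    order = np.argsort(importance)                # ascending importance
    low, high = set(order[:k]), set(order[-k:])
    positive = [item for i, item in enumerate(seq) if i not in low]
    negative = [int(rng.choice(item_pool)) if i in high else item
                for i, item in enumerate(seq)]
    return positive, negative
```

The resulting positive/negative sequences can then feed any standard contrastive loss over sequence encodings; the `frac` parameter controlling how many items are perturbed is a hypothetical knob for this sketch.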