Contrastive Learning in 3 Minutes

#artificialintelligence 

After several years of research focused on the supervised domain of image recognition, many researchers have turned to far less explored territory: performing the same tasks in a self-supervised manner. One of the cornerstones that led to dramatic advances in this seemingly impossible setting is the introduction of contrastive learning losses. This article dives into some of the recently proposed contrastive losses that have pushed unsupervised learning to results rivaling supervised learning. One of the earliest contrastive losses proposed was the InfoNCE loss by Oord et al. Ultimately, this simple loss forces positive pairs to score higher similarity than negatives, pushing the softmax ratio inside the logarithm toward 1 and the loss toward 0, while pushing negative pairs further apart.
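To make this concrete, here is a minimal NumPy sketch of an InfoNCE-style loss. It assumes cosine similarity, in-batch negatives (every other sample's positive acts as a negative), and a temperature hyperparameter; the function name `info_nce_loss` and the temperature value are illustrative, not from the original paper's code.

```python
import numpy as np

def info_nce_loss(anchors, positives, temperature=0.1):
    """Illustrative InfoNCE sketch: the i-th anchor's positive is the i-th
    row of `positives`; all other rows in the batch act as negatives."""
    # L2-normalise so dot products are cosine similarities
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    # logits[i, j] = sim(anchor_i, positive_j) / temperature
    logits = a @ p.T / temperature
    # Cross-entropy against the diagonal: anchor_i should match positive_i.
    # When the diagonal dominates, the softmax ratio -> 1 and the loss -> 0.
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

rng = np.random.default_rng(0)
x = rng.normal(size=(8, 16))
# Positives as slightly perturbed "views" of the anchors: low loss
loss_aligned = info_nce_loss(x, x + 0.01 * rng.normal(size=x.shape))
# Unrelated positives: the loss is much larger
loss_random = info_nce_loss(x, rng.normal(size=(8, 16)))
```

In practice the similarity function, augmentation pipeline, and temperature are the main design choices; frameworks such as SimCLR reuse exactly this cross-entropy-over-similarities structure.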
