Lecture

Self-supervised learning

Description

This lecture covers self-supervised learning, focusing on techniques for learning embeddings that transfer well to downstream tasks, the use of the Hilbert-Schmidt Independence Criterion (HSIC) to prevent network collapse, and the role of geometric information. It also discusses the HSIC loss function, one-hot encoding, and the connection with clustering. The lecture emphasizes the correlation between the feature embedding and identity, the penalization of high-variance representations, and the simplifications afforded by the special structure of the data. Finally, it develops the inner-product intuition and examines the ways in which collapse can occur in self-supervised learning.
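To make the HSIC ideas above concrete, here is a minimal sketch of an HSIC-style self-supervised objective. It is not the lecture's exact formulation: it assumes an RBF kernel on the embeddings, a linear kernel on one-hot identity labels, and illustrative names (`hsic`, `ssl_hsic_loss`, `gamma`, `sigma`) chosen for this example. The loss rewards dependence between the feature embedding and identity while penalizing the self-dependence term, which discourages high-variance or collapsed representations.

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    """RBF (Gaussian) kernel matrix for the rows of X."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic(K, L):
    """Biased empirical HSIC estimate: tr(K H L H) / (n - 1)^2."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n  # centering matrix
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

def ssl_hsic_loss(Z, Y, gamma=1.0, sigma=1.0):
    """Illustrative HSIC-style objective: maximize dependence between
    embeddings Z and one-hot identities Y, penalize HSIC(Z, Z)."""
    K = rbf_kernel(Z, sigma)   # kernel on feature embeddings
    L = Y @ Y.T                # linear kernel on one-hot identity labels
    return -hsic(K, L) + gamma * np.sqrt(hsic(K, K))

# Toy usage: 8 augmented views of 4 images, 16-dimensional embeddings.
rng = np.random.default_rng(0)
Z = rng.normal(size=(8, 16))
Y = np.eye(4)[np.repeat(np.arange(4), 2)]  # one-hot identity per view
print(ssl_hsic_loss(Z, Y))
```

Note that with one-hot identities the linear kernel Y @ Y.T is block-structured, grouping augmented views of the same image together; this is one way the special structure of the data simplifies the objective and links it to clustering, under the assumptions stated above.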
