[D] Paper Explained – Big Self-Supervised Models are Strong Semi-Supervised Learners (Full Video Analysis)

https://youtu.be/2lkUNDZld-4

This paper proposes SimCLRv2 and shows that semi-supervised learning benefits greatly from self-supervised pre-training. Strikingly, the effect grows as fewer labels are available and as the model gets more parameters.
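
For orientation, the contrastive objective covered at 5:45 is the NT-Xent loss from SimCLR. The following is a minimal NumPy sketch of that loss, not the repository's TensorFlow implementation; the function name, the temperature default, and the NumPy framing are illustrative assumptions.

```python
import numpy as np

def nt_xent_loss(z1, z2, temperature=0.1):
    """NT-Xent contrastive loss over a batch of paired views.

    z1, z2: (N, d) projection-head outputs for two augmentations of the
    same N images. Returns the mean loss over all 2N anchors.
    """
    # L2-normalise so dot products become cosine similarities.
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)

    z = np.concatenate([z1, z2], axis=0)            # (2N, d)
    sim = (z @ z.T) / temperature                   # (2N, 2N) similarity logits
    np.fill_diagonal(sim, -np.inf)                  # an anchor is never its own negative

    n = z1.shape[0]
    # The positive for anchor i is the other view of the same image.
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(0, n)])

    # Softmax cross entropy with the positive treated as the correct "class".
    logits = sim - sim.max(axis=1, keepdims=True)   # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return float(-log_prob[np.arange(2 * n), pos].mean())
```

With random (d-dimensional) projections, e.g. `nt_xent_loss(rng.normal(size=(256, 128)), rng.normal(size=(256, 128)))`, the loss sits near its chance level; real training minimizes it by pulling the two views of each image together and pushing all other images away.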

OUTLINE:

0:00 – Intro & Overview

1:40 – Semi-Supervised Learning

3:50 – Pre-Training via Self-Supervision

5:45 – Contrastive Loss

10:50 – Retaining Projection Heads

13:10 – Supervised Fine-Tuning

13:45 – Unsupervised Distillation & Self-Training (see the sketch after this outline)

18:45 – Architecture Recap

22:25 – Experiments

34:15 – Broader Impact
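
The distillation step in the outline (13:45) trains a student network to imitate the fine-tuned teacher's predictions on unlabeled images. Below is a minimal NumPy sketch of that objective under the assumption that teacher and student logits are already computed; the function names and the temperature default are illustrative, not taken from the released code.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax with the usual max-subtraction for stability.
    scaled = logits / temperature
    scaled = scaled - scaled.max(axis=1, keepdims=True)
    exp = np.exp(scaled)
    return exp / exp.sum(axis=1, keepdims=True)

def distillation_loss(teacher_logits, student_logits, temperature=1.0):
    """Cross entropy between teacher and student predictive distributions.

    Both inputs are (batch, num_classes) logits computed on *unlabeled*
    images, so no ground-truth labels enter this step.
    """
    teacher_probs = softmax(teacher_logits, temperature)
    student_log_probs = np.log(softmax(student_logits, temperature))
    return float(-(teacher_probs * student_log_probs).sum(axis=1).mean())
```

Because only the teacher's soft predictions are needed, the same loss works whether the student is a smaller network or the same architecture being self-trained.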

Paper: https://arxiv.org/abs/2006.10029

Code: https://github.com/google-research/simclr

submitted by /u/ykilcher