June 28, 2020

[D] Opinion on AltDeep.ai's causality course

Hi, I am interested in the application of causality in machine learning. Most of the online courses out there focus on the basics with some examples […]
June 29, 2020

[D] Glow – The Representations Must Flow

Get interpretable latent representations by composing non-linear invertible functions and maximizing the exact log-likelihood. Flow-based models are the odd machines in the corner of the neural […]
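The "exact log-likelihood" in the snippet comes from the change-of-variables formula: if an invertible map z = f(x) sends data to a simple base density, then log p(x) = log p_Z(f(x)) + log |det ∂f/∂x|. A minimal NumPy sketch of this for an elementwise affine flow under a standard-normal base (function and parameter names are illustrative, not from the Glow codebase):

```python
import numpy as np

def affine_flow_logprob(x, scale, shift):
    """Exact log-likelihood of x under the invertible elementwise map
    z = scale * x + shift, with a standard-normal base density on z.
    log p(x) = log N(z; 0, 1) + log |det dz/dx|, summed over dimensions."""
    z = scale * x + shift
    base_logprob = -0.5 * (z**2 + np.log(2 * np.pi))  # log N(z; 0, 1) per dim
    log_det = np.log(np.abs(scale))                   # Jacobian is diagonal here
    return float(np.sum(base_logprob + log_det))

x = np.array([0.3, -1.2])
ll = affine_flow_logprob(x, scale=np.array([2.0, 0.5]), shift=np.array([0.1, 0.0]))
```

Real flow models stack many such invertible layers (with learned, input-dependent scales and shifts) and train by maximizing this quantity directly, no variational bound needed.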
June 29, 2020

[D] Is there a way to differentiably sample from entmax?

We can sample from softmax by applying Gumbel noise. Can it also be applied to entmax output? Or is there another way? submitted by /u/rx303
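The softmax baseline the question refers to is the Gumbel-max trick (and its differentiable Gumbel-softmax relaxation). A minimal NumPy sketch of that baseline, assuming a standard-normal-style setup with illustrative function names; whether an analogous noise construction exists for entmax is exactly the open question:

```python
import numpy as np

rng = np.random.default_rng(0)

def gumbel_max_sample(logits, rng):
    """Discrete sample: argmax(logits + Gumbel noise) is distributed
    as Categorical(softmax(logits))."""
    return int(np.argmax(logits + rng.gumbel(size=logits.shape)))

def gumbel_softmax_sample(logits, rng, tau=0.5):
    """Differentiable relaxation: softmax((logits + Gumbel noise) / tau).
    As tau -> 0 the sample approaches a one-hot vector."""
    y = (logits + rng.gumbel(size=logits.shape)) / tau
    y = y - y.max()          # subtract max for numerical stability
    p = np.exp(y)
    return p / p.sum()

logits = np.array([2.0, 1.0, 0.1])
counts = np.bincount(
    [gumbel_max_sample(logits, rng) for _ in range(10_000)], minlength=3
)
# empirical frequencies of `counts` approach softmax(logits)
```

The relaxation is differentiable because the argmax is replaced by a temperature-controlled softmax; the difficulty with entmax is that its sparse outputs assign exact zeros, so the softmax-specific Gumbel argument does not carry over directly.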
June 29, 2020

[D] Paper Explained – Big Self-Supervised Models are Strong Semi-Supervised Learners (Full Video Analysis)

https://youtu.be/2lkUNDZld-4 This paper proposes SimCLRv2 and shows that semi-supervised learning benefits a lot from self-supervised pre-training. And stunningly, that effect gets larger the fewer labels are […]