The Key Equation Behind Probability

Artem Kirsanov · 222,645 views · 7 months ago

Get 4 months extra on a 2-year plan here: https://nordvpn.com/artemkirsanov. It's risk-free with Nord's 30-day money-back guarantee!

Socials:
X/Twitter: https://x.com/ArtemKRSV
Patreon: https://www.patreon.com/artemkirsanov

My name is Artem. I'm a graduate student at the NYU Center for Neural Science and a researcher at the Flatiron Institute (Center for Computational Neuroscience).

In this video, we explore the fundamental concepts that underlie probability theory and its applications in neuroscience and machine learning. We begin with the intuitive idea of surprise and its relation to probability, using real-world examples to illustrate these concepts.
From there, we move into more advanced topics (a small numerical sketch of these quantities follows the list):
1) Entropy – measuring the average surprise in a probability distribution.
2) Cross-entropy and the loss of information when approximating one distribution with another.
3) Kullback-Leibler (KL) divergence and its role in quantifying the difference between two probability distributions.
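
For readers who want to see these quantities concretely, here is a minimal numerical sketch in Python (an illustration of the standard definitions, not code from the video), using two small discrete distributions: p as the "true" distribution and q as a model approximating it.

import numpy as np

p = np.array([0.5, 0.25, 0.125, 0.125])   # "true" distribution over four outcomes
q = np.array([0.25, 0.25, 0.25, 0.25])    # model / approximating distribution

surprisal = -np.log2(p)                    # surprise of each outcome, in bits
entropy = np.sum(p * surprisal)            # H(p): average surprise under p  -> 1.75 bits
cross_entropy = -np.sum(p * np.log2(q))    # H(p, q): average surprise when q models p -> 2.0 bits
kl_divergence = cross_entropy - entropy    # D_KL(p || q) = H(p, q) - H(p) -> 0.25 bits (always >= 0)

The KL divergence is exactly the extra surprise paid for using q instead of p, which is why minimizing cross-entropy with respect to a model amounts to closing that gap.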

OUTLINE:
00:00 Introduction
02:00 Sponsor: NordVPN
04:07 What is probability? (Bayesian vs. frequentist)
06:42 Probability Distributions
10:17 Entropy as average surprisal
13:53 Cross-Entropy and Internal models
19:20 Kullback–Leibler (KL) divergence
20:46 Objective functions and Cross-Entropy minimization
24:22 Conclusion & Outro

CREDITS:
Special thanks to Crimson Ghoul for providing English subtitles!

Icons by https://www.freepik.com/
