Introduction to KL-Divergence | Simple Example | with usage in TensorFlow Probability

Machine Learning & Simulation · 17,382 views · 4 years ago

The Kullback-Leibler divergence measures how far apart two probability distributions are. We can conveniently calculate it with the help of TensorFlow Probability. Here are the notes: https://raw.githubusercontent.com/Ceyron/machine-learning-and-simulation/main/english/probabilistic_machine_learning/kl_divergence_intro.pdf

The KL divergence is especially relevant when we want to fit one distribution to another. It has multiple applications in Probabilistic Machine Learning and Statistics. In a later video, we will use it to derive Variational Inference, a powerful tool for fitting surrogate posterior distributions.
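As a quick sketch of the idea (the distributions here are hypothetical example values, not the ones from the video): for two discrete distributions P and Q over the same outcomes, KL(P || Q) = Σᵢ pᵢ log(pᵢ/qᵢ). The following pure-Python snippet computes this directly; in TensorFlow Probability the same quantity can be obtained with `tfp.distributions.kl_divergence(dist_a, dist_b)` on two distribution objects.

```python
import math

def kl_divergence(p, q):
    """KL(P || Q) = sum_i p_i * log(p_i / q_i), in nats.

    Terms with p_i == 0 contribute 0 by convention.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Hypothetical example: a fair coin P vs. a biased coin Q
p = [0.5, 0.5]
q = [0.75, 0.25]

print(kl_divergence(p, q))  # ~0.1438 nats
print(kl_divergence(q, p))  # ~0.1308 nats
```

Note that the two printed values differ: the KL divergence is not symmetric, which is why it is called a divergence rather than a distance.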

-------

📝 : Check out the GitHub Repository of the channel, where I upload all the handwritten notes and source-code files (contributions are very welcome): https://github.com/Ceyron/machine-learning-and-simulation

📢 : Follow me on LinkedIn or Twitter for updates on the channel and other cool Machine Learning & Simulation stuff: https://www.linkedin.com/in/felix-koehler and https://twitter.com/felix_m_koehler

💸 : If you want to support my work on the channel, you can become a Patreon here: https://www.patreon.com/MLsim

-------

Timestamps:
0:00 Opening
0:15 Intuition
3:21 Definition
5:28 Example
13:29 TensorFlow Probability
