GMMs are used for clustering data or as generative models. Let's start building an understanding by looking at a one-dimensional (1D) example. Here are the notes: https://raw.githubusercontent.com/Ceyron/machine-learning-and-simulation/main/english/probabilistic_machine_learning/univariate_gaussian_mixture_intro.pdf

If your (univariate) distribution has more than one mode (peak), there is a good chance you can model it with a Gaussian Mixture Model (GMM), a mixture distribution of Gaussians/Normals. This is helpful for a soft clustering of points in one dimension. To set it up, you select the number of modes you expect (= the number of peaks). This then corresponds to the number of (latent) classes as well as the number of Gaussians that have to be defined.

In this video, I provide an intuition for this by looking at the grade distribution after an exam, with a first peak at 2.5 and a second peak at the grade corresponding to a fail. We will implement this model in TensorFlow Probability.

-------

📝 : Check out the GitHub Repository of the channel, where I upload all the handwritten notes and source-code files (contributions are very welcome): https://github.com/Ceyron/machine-learning-and-simulation

📢 : Follow me on LinkedIn or Twitter for updates on the channel and other cool Machine Learning & Simulation stuff: https://www.linkedin.com/in/felix-koehler and https://twitter.com/felix_m_koehler

💸 : If you want to support my work on the channel, you can become a Patron here: https://www.patreon.com/MLsim

-------

Timestamps:
00:00 Introduction
00:38 A Multi-Modal Distribution
01:10 Clustering of Points
02:04 A Superposition of Gaussians?
03:59 Using Mixture Coefficients
05:05 A special case of Mixture Distributions
05:33 The Directed Graphical Model
07:52 Alternative Model with plates
08:45 The joint
10:28 TFP: Defining the Parameters
11:27 TFP: The Categorical
12:12 TFP: The batched Normal
13:13 TFP: GMM in Principle
14:13 TFP: Using the TFP Mixture Distribution
15:15 TFP: Plotting the probability density
17:05 Outro
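-------

For reference, here is a minimal sketch of the kind of model built in the video, assuming tensorflow, tensorflow-probability, and matplotlib are installed. The mixture weights, means, and standard deviations below are illustrative placeholders for the two-peak grade example, not the exact values used in the video:

import tensorflow as tf
import tensorflow_probability as tfp
import matplotlib.pyplot as plt

tfd = tfp.distributions

# Two latent classes, e.g. "passed" vs. "failed" in the grade example
mixture_weights = [0.7, 0.3]   # pi: probability of each latent class
means = [2.5, 5.0]             # mu: locations of the two peaks (fail at 5.0)
stds = [0.5, 0.3]              # sigma: spread of each Gaussian component

# The Categorical picks a component, the batched Normal defines both
# Gaussians at once; MixtureSameFamily combines them into the GMM
gmm = tfd.MixtureSameFamily(
    mixture_distribution=tfd.Categorical(probs=mixture_weights),
    components_distribution=tfd.Normal(loc=means, scale=stds),
)

# Evaluate and plot the probability density over a grid of grades
x = tf.linspace(1.0, 6.0, 500)
plt.plot(x, gmm.prob(x))
plt.xlabel("grade")
plt.ylabel("density")
plt.show()

Drawing samples with gmm.sample(1000) would give you synthetic grades from the same two-peaked distribution, which is the generative-model view mentioned at the start.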