Variational Inference by Automatic Differentiation in TensorFlow Probability

Machine Learning & Simulation 6,072 views 3 years ago

We find a surrogate posterior by maximizing the Evidence Lower Bound (ELBO). Given a parameterized proposal distribution, this optimization can be handled by Automatic Differentiation and a gradient-based optimizer. Here are the notes: https://raw.githubusercontent.com/Ceyron/machine-learning-and-simulation/main/english/probabilistic_machine_learning/vi_automatic_differentation_posterior_normal.pdf

Variational Inference can be hard to understand because many difficult components are involved at the same time. However, if we outsource the tedious parts, like taking derivatives, it becomes easier to understand what is actually going on. That is what this video is about: we use the Automatic Differentiation of TensorFlow (Probability) to fit a surrogate posterior by gradient-based optimization (e.g., using the Adam optimizer).

For this, we have to define the directed graphical model and find a way to approximate the Evidence Lower Bound (ELBO). All of it is explained intuitively in the video, and it is conveniently abstracted by TensorFlow Probability's function "tfp.vi.fit_surrogate_posterior".

Any questions? I would be happy to answer them in the comment section :)

-------

📝 : Check out the GitHub Repository of the channel, where I upload all the handwritten notes and source-code files (contributions are very welcome): https://github.com/Ceyron/machine-learning-and-simulation

📢 : Follow me on LinkedIn or Twitter for updates on the channel and other cool Machine Learning & Simulation stuff: https://www.linkedin.com/in/felix-koehler and https://twitter.com/felix_m_koehler

💸 : If you want to support my work on the channel, you can become a Patreon here: https://www.patreon.com/MLsim

-------

Timestamps:
00:00 Introduction
01:03 Example problem: Posterior Normal
01:27 Example problem: DGM
03:13 Example problem: Observed Data
03:44 Example problem: seek posterior
04:27 Example problem: True Posterior
06:08 Surrogate Posterior proposal
06:53 Variational Optimization to Vector Optimization
08:13 Plugging in the ELBO
09:02 Access to the log joint
10:00 Changing to minimization
10:28 ELBO based on scalars
10:58 Automatic Differentiation for obtaining gradients
12:25 Gradient-Based optimization
13:31 We know the true posterior
13:57 Approximating the ELBO by sampling
18:19 TFP: Creating a dataset
19:19 TFP: Parameters of True Posterior
21:58 TFP: Defining the log-joint
26:20 TFP: Creating the Surrogate Posterior
28:35 TFP: ELBO and Gradient Tape (Automatic Differentiation)
31:31 TFP: Performing Variational Inference
33:39 TFP: Comparing True and Surrogate Posterior
35:06 Outro
