This video discusses Residual Networks (ResNets), one of the most popular machine learning architectures, which has enabled considerably deeper neural networks through skip (jump) connections. The architecture mimics many aspects of a numerical integrator.
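As a rough illustration of the connection discussed in the video (this sketch is not from the video itself; the weights and the small two-layer function f are hypothetical stand-ins for a learned residual), a residual block computes y = x + f(x), which has the same form as one forward Euler step of dx/dt = f(x):

```python
import numpy as np

def f(x, W1, W2):
    """Toy two-layer transformation standing in for the learned residual."""
    return W2 @ np.tanh(W1 @ x)

def residual_block(x, W1, W2):
    """Skip connection: pass x through unchanged and add the learned residual."""
    return x + f(x, W1, W2)

def euler_step(x, h, W1, W2):
    """One forward Euler step of dx/dt = f(x); same form as the block when h = 1."""
    return x + h * f(x, W1, W2)

rng = np.random.default_rng(0)
x = rng.standard_normal(4)
W1, W2 = rng.standard_normal((8, 4)), rng.standard_normal((4, 8))

# The residual block and a unit-step Euler update agree exactly.
print(np.allclose(residual_block(x, W1, W2), euler_step(x, 1.0, W1, W2)))  # True
```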
This video was produced at the University of Washington, and we acknowledge funding support from the Boeing Company.
%%% CHAPTERS %%%
00:00 Intro
01:09 Concept: Modeling the Residual
03:26 Building Blocks
05:59 Motivation: Deep Network Signal Loss
07:43 Extending to Classification
09:00 Extending to DiffEqs
10:16 Impact of CVPR and ResNet
12:17 ResNets and Euler Integrators
13:34 Neural ODEs and Improved Integrators
16:07 Outro