#vectors #linearalgebra #matrices
The Singular Value Decomposition is one of the most important algorithms in linear algebra. It finds the ellipse hidden in every linear transformation. This ellipse reveals the most important "directions" of the transformation, so that we can extract the most meaningful concepts from a huge data set. We show how this works for the classification of human faces and for movie review prediction. Welcome to the world of artificial intelligence.
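To make the decomposition itself concrete, here is a minimal numpy sketch; the matrix A is just an arbitrary example:

    import numpy as np

    # Any matrix A factors as U @ diag(s) @ Vt: a rotation/reflection,
    # an axis-aligned scaling (the singular values), then another rotation.
    A = np.array([[3.0, 1.0],
                  [1.0, 2.0]])
    U, s, Vt = np.linalg.svd(A)

    print(s)                                    # the axes of the hidden ellipse
    print(np.allclose(A, U @ np.diag(s) @ Vt))  # True: the three parts rebuild A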
Small correction at 21:12: Some of the columns do have non-zero coordinates in common, of course, but they happen to cancel each other out, so the dot product is still zero.
To watch exclusive content, and to get access to new videos many months before they appear on YouTube, consider supporting us on Patreon: https://www.patreon.com/user?u=86649007
If you want to dig deeper into the singular value decomposition, here's a long list of interesting links for you to explore:
[SERR 1] https://www.youtube.com/watch?v=DG7YTlGnCEo
One of the best introductions to the SVD. It clearly shows the rotation, scaling, and second rotation, and how a matrix can be compressed by ignoring its smallest singular values. He then applies this to image compression. Very cool!
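If you want to try that compression idea yourself, here is a minimal numpy sketch; the random 'img' array is only a stand-in for a real grayscale image:

    import numpy as np

    # Rank-k approximation: keep only the k largest singular values.
    img = np.random.rand(256, 256)  # placeholder for a real image
    U, s, Vt = np.linalg.svd(img, full_matrices=False)

    k = 20
    compressed = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

    # Storage drops from 256*256 values to k*(256 + 256 + 1).
    err = np.linalg.norm(img - compressed) / np.linalg.norm(img)
    print(f"relative error with k={k}: {err:.3f}")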
[SERR 2] https://www.youtube.com/watch?v=g-Hb26agBFg
Principal component analysis.
[SERR 3] https://www.youtube.com/watch?v=WBlnwvjfMtQ
The covariance matrix.
[STO 1] https://www.youtube.com/watch?v=CpD9XlTu3ys
Explains the SVD from a different angle. Instead of trying to map a circle to an ellipse, we ask: how do we maintain orthogonality? The result is again the SVD.
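That orthogonality view is easy to check numerically; a tiny numpy sketch with a random matrix:

    import numpy as np

    # The SVD finds an orthonormal basis (the columns of V) whose images
    # under A stay orthogonal: A v_i = s_i * u_i.
    A = np.random.rand(4, 3)
    U, s, Vt = np.linalg.svd(A, full_matrices=False)

    images = A @ Vt.T  # column i is A v_i
    print(np.allclose(images.T @ images, np.diag(s**2)))  # True: still orthogonal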
[AIO 1] https://www.youtube.com/watch?v=P5mlg91as1c
[AIO 2] https://www.youtube.com/watch?v=K38wVcdNuFc
Using the SVD to discover movie genres from review data, and to predict whether or not a specific user will like a specific movie.
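A toy version of that idea, with a made-up ratings matrix (rows are users, columns are movies):

    import numpy as np

    R = np.array([[5, 4, 1, 1],
                  [4, 5, 1, 2],
                  [1, 1, 5, 4],
                  [2, 1, 4, 5]], dtype=float)

    # Keep only k = 2 singular values: assume two hidden "genres".
    U, s, Vt = np.linalg.svd(R)
    k = 2
    R_hat = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

    # R_hat gives a smoothed prediction for every user/movie pair.
    print(np.round(R_hat, 1))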
[AIO 3] https://www.youtube.com/watch?v=UyAfmAZU_WI
These SVD techniques allow you to compress your data to much smaller sizes, and to reduce its number of dimensions to make it easier to analyze.
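A minimal sketch of that dimension reduction in numpy; the data here is random filler standing in for a real data set:

    import numpy as np

    # Project centered data onto the top-k right singular vectors
    # (the directions along which the data varies the most).
    X = np.random.rand(100, 10)   # 100 samples, 10 features
    Xc = X - X.mean(axis=0)

    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    k = 2
    X_reduced = Xc @ Vt[:k].T     # now 100 samples, only 2 features

    print(X_reduced.shape)        # (100, 2)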
[SB 1] https://www.youtube.com/watch?v=gXbThCXjZFM&list=PLMrJAkhIeNNSVjnsviglFoY2nXildDCcv
This is a very good series about the SVD. It's explained very clearly, and it shows many concrete examples with actual Python code.
[SB 2] https://www.youtube.com/watch?v=H7qMMudo3e8
Steve Brunton shows actual Python code for compressing an image of his dog. He also plots the size of the singular values so that you can gauge how many of them are small enough to ignore.
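You can reproduce that kind of plot with a few lines of numpy and matplotlib; a random placeholder matrix stands in for his dog photo:

    import numpy as np
    import matplotlib.pyplot as plt

    # A log-scale plot of the singular values; a sharp drop-off shows
    # how many of them you can safely discard.
    A = np.random.rand(200, 200)  # placeholder for real image data
    s = np.linalg.svd(A, compute_uv=False)

    plt.semilogy(s)
    plt.xlabel("index")
    plt.ylabel("singular value")
    plt.show()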
[MIT 1] https://www.youtube.com/watch?v=UCc9q_cAhho
Proves that a symmetric matrix always has real eigenvalues and orthogonal eigenvectors.
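A quick numerical sanity check of that theorem (numpy's eigh is built for the symmetric case):

    import numpy as np

    B = np.random.rand(4, 4)
    S = (B + B.T) / 2  # force symmetry

    vals, vecs = np.linalg.eigh(S)
    print(np.all(np.isreal(vals)))                # True: real eigenvalues
    print(np.allclose(vecs.T @ vecs, np.eye(4)))  # True: orthonormal eigenvectors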
[BN 1] https://www.youtube.com/watch?v=yfj8uMwAgrI
This video is a bit more theoretical and deep, but it draws some interesting connections. It uses the SVD to look at the 4 fundamental subspaces of a matrix, which gives us an alternative intuition about the transpose.
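A small numpy sketch of how the SVD hands you those four subspaces; the rank-1 matrix is a deliberately simple example:

    import numpy as np

    A = np.array([[1.0, 2.0, 3.0],
                  [2.0, 4.0, 6.0]])  # rank 1 on purpose
    U, s, Vt = np.linalg.svd(A)
    r = np.sum(s > 1e-10)            # numerical rank

    col_space  = U[:, :r]            # column space of A
    left_null  = U[:, r:]            # null space of A^T
    row_space  = Vt[:r].T            # row space of A
    null_space = Vt[r:].T            # null space of A

    print(np.allclose(A @ null_space, 0))  # True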
0:00 Introduction
2:59 Decomposing a matrix into 3 parts
5:44 The covariance matrix
11:55 PCA in higher dimensions
13:55 Human faces
17:59 Movie reviews
23:25 How to calculate the SVD
28:39 Please support our channel
29:04 Comparison between eigenstuff and SVD
This video is published under a CC Attribution license
( https://creativecommons.org/licenses/by/4.0/ )