Many of the animations used in this video come from Jonathan Barron [1, 2]. Give this researcher a like for his hard work!
SUBSCRIBE FOR MORE CONTENT!
RESOURCES
[1] Paper on the adaptive loss function ("A General and Adaptive Robust Loss Function"): https://arxiv.org/abs/1701.03077
[2] CVPR paper presentation: https://www.youtube.com/watch?v=BmNKbnF69eY
[3] Regression loss functions: https://alexisalulema.com/2017/12/07/loss-functions-part-1/
[4] Classification losses: https://alexisalulema.com/2017/12/07/loss-functions-part-1/
[5] ML cheat sheet for loss functions: https://ml-cheatsheet.readthedocs.io/en/latest/loss_functions.html
[6] Seven loss functions with Python code: https://www.analyticsvidhya.com/blog/2019/08/detailed-guide-7-loss-functions-machine-learning-python-code/
[7] A blog on the most common loss functions: https://towardsdatascience.com/understanding-the-3-most-common-loss-functions-for-machine-learning-regression-23e0ef3e14d3
[8] Modeling the Huber loss: https://www.textbook.ds100.org/ch/10/modeling_abs_huber.html
[9] Notes on subgradients: https://see.stanford.edu/materials/lsocoee364b/01-subgradients_notes.pdf
[10] Code to get up to speed (scikit-learn SGDRegressor): https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.SGDRegressor.html
[11] The difference between cross-entropy and KL divergence: https://stats.stackexchange.com/questions/357963/what-is-the-difference-cross-entropy-and-kl-divergence
[12] A great video explanation of entropy, cross-entropy, and KL divergence: https://www.youtube.com/watch?v=ErfnhcEV1O8