Sebastian's books: https://sebastianraschka.com/books/
In this video, we will take the concept of boosting a step further and talk about gradient boosting. Whereas AdaBoost reweights the training examples to boost the trees in the next round, gradient boosting fits each new tree in the sequence to the negative gradients of the loss, the so-called pseudo-residuals (see the short sketch below the links).
XGBoost paper mentioned in the video: https://dl.acm.org/doi/pdf/10.1145/2939672.2939785
Link to the code: https://github.com/rasbt/stat451-machine-learning-fs20/tree/master/L07/code
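To make the residual-fitting idea concrete, here is a minimal, hedged sketch of gradient boosting for regression with a squared-error loss (not the code from the course repo above; the learning rate, tree depth, and number of rounds are illustrative choices). Under squared error, the negative gradient with respect to the current prediction is simply the residual y minus the prediction:

import numpy as np
from sklearn.datasets import make_regression
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=500, n_features=10, random_state=123)

learning_rate = 0.1   # shrinkage factor; illustrative choice
n_rounds = 100        # number of boosting rounds; illustrative choice

# Start from a constant model: the mean of the targets.
prediction = np.full(y.shape, y.mean())
trees = []

for _ in range(n_rounds):
    # Pseudo-residuals = negative gradient of the squared-error loss.
    residuals = y - prediction
    # Fit the next (shallow) tree to the residuals.
    tree = DecisionTreeRegressor(max_depth=3, random_state=123)
    tree.fit(X, residuals)
    trees.append(tree)
    # Update the ensemble prediction with a shrunken step.
    prediction += learning_rate * tree.predict(X)

print("Training MSE:", np.mean((y - prediction) ** 2))

Each round improves the ensemble by taking a small step in the direction that most reduces the loss, which is why the method is called gradient boosting.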
-------
This video is part of my Introduction to Machine Learning course.
Next video: https://youtu.be/r5C3TUIw6Zk
The complete playlist: https://www.youtube.com/playlist?list=PLTKMiZHVd_2KyGirGEvKlniaWeLOHhUF3
A handy overview page with links to the materials: https://sebastianraschka.com/blog/2021/ml-course.html
-------
If you want to be notified about future videos, please consider subscribing to my channel: https://youtube.com/c/SebastianRaschka