https://thinkingneuron.com/data-science-interview-questions-for-it-industry-part-3-supervised-ml/#XGBoost
XGBoost is short for eXtreme Gradient Boosting.
It is one of the most popular algorithms in the machine learning universe, and it is used by the winners of many hackathons!
The reasons for its popularity are listed below:
* Handles large datasets with ease
* Trains very fast
* High accuracy on most datasets
The key differentiator of XGBoost is how exactly it corrects the mistakes of the previous model. In the previous algorithm, AdaBoost, I discussed how it assigns higher weights to the incorrectly predicted rows so that the next model focuses on reducing those errors.
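To make the AdaBoost reweighting idea concrete, here is a minimal sketch using scikit-learn decision stumps on a hypothetical toy dataset (the data and all variable names are mine, not from the article): misclassified rows get their weights boosted, so the next stump concentrates on them.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Hypothetical toy data: label is the sign of x0 + x1
rng = np.random.RandomState(1)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

# AdaBoost-style loop: start with uniform sample weights
w = np.full(len(y), 1 / len(y))
stumps, alphas = [], []
for _ in range(10):
    stump = DecisionTreeClassifier(max_depth=1)
    stump.fit(X, y, sample_weight=w)        # stump sees the current weights
    miss = stump.predict(X) != y
    err = np.dot(w, miss)                   # weighted error of this stump
    alpha = 0.5 * np.log((1 - err) / (err + 1e-10))  # stump's vote weight
    w *= np.exp(alpha * np.where(miss, 1, -1))  # increase weights of mistakes
    w /= w.sum()                            # renormalize
    stumps.append(stump)
    alphas.append(alpha)

# Final prediction: weighted majority vote of all stumps
score = sum(a * (2 * s.predict(X) - 1) for s, a in zip(stumps, alphas))
acc = (np.sign(score) == 2 * y - 1).mean()
print(acc)  # training accuracy of the boosted ensemble
```

Each individual stump is a weak, axis-aligned split, but the reweighting forces later stumps to patch the earlier ones' mistakes, so the weighted vote approximates the diagonal boundary.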
In XGBoost, each new tree is instead fit to the error gradient of the loss function (the mistakes, or residuals, of the previous model), nudging the predictions in the direction that reduces the error.
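A hand-rolled sketch of this gradient-fitting idea (using scikit-learn trees rather than the XGBoost library, and a toy dataset of my own invention): for squared-error loss, the negative gradient is simply the residual, so each new shallow tree is trained on what the ensemble so far got wrong.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Hypothetical toy data: noisy quadratic
rng = np.random.RandomState(0)
X = np.linspace(-3, 3, 200).reshape(-1, 1)
y = X.ravel() ** 2 + rng.normal(scale=0.3, size=200)

# Gradient boosting sketch: start from the mean prediction
pred = np.full_like(y, y.mean())
learning_rate = 0.1
for _ in range(50):
    residual = y - pred               # the "mistakes of the previous model"
    tree = DecisionTreeRegressor(max_depth=2)
    tree.fit(X, residual)             # new tree learns the current errors
    pred += learning_rate * tree.predict(X)  # correct a fraction of them

mse = np.mean((y - pred) ** 2)
print(mse)  # training MSE shrinks as trees are added
```

This is the core mechanic that XGBoost builds on; the real library adds regularization, second-order gradient information, and the engineering (sparsity handling, parallel split finding) that makes it fast on large datasets.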