Boosting
Definition
Boosting is a sequential ensemble process in which each subsequent model attempts to correct the errors of the previous one.
How it works
Boosting trains weak learners sequentially, with each iteration's training focused on the instances that previous learners misclassified, so that each new learner improves on the last. The process repeats until a stopping condition is met, and the final prediction is made by aggregating the predictions of all the learners.
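To make that loop concrete, here is a minimal sketch in Python of one common reweighting scheme (AdaBoost-style, with decision stumps as the weak learners). The toy dataset, round count, and variable names are illustrative choices, not part of the original text.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, random_state=0)
y_signed = np.where(y == 1, 1, -1)          # labels in {-1, +1} for weighted voting

n_rounds = 10
weights = np.full(len(X), 1 / len(X))       # start with equal instance weights
learners, alphas = [], []

for _ in range(n_rounds):
    stump = DecisionTreeClassifier(max_depth=1)
    stump.fit(X, y_signed, sample_weight=weights)
    pred = stump.predict(X)

    err = weights[pred != y_signed].sum()            # weighted error this round
    alpha = 0.5 * np.log((1 - err) / max(err, 1e-10))  # this learner's vote weight

    # Increase weights on misclassified instances so the next learner focuses on them.
    weights *= np.exp(-alpha * y_signed * pred)
    weights /= weights.sum()

    learners.append(stump)
    alphas.append(alpha)

# Final prediction: aggregate all weak learners by a weighted vote.
ensemble = sum(a * l.predict(X) for a, l in zip(alphas, learners))
final_pred = np.sign(ensemble)
print("training accuracy:", (final_pred == y_signed).mean())
```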
Considerations
Boosting can be computationally expensive, sensitive to noisy data and outliers (which can lead to overfitting), and slower to train than parallel ensemble methods such as bagging, because its learners must be fit one after another.
There are three main types of boosting algorithms:
- Adaptive Boosting: Adaptive Boosting (AdaBoost) starts by assigning equal weight to every instance in the training set and fitting a weak learner with the base learning algorithm. After each round, the instances that learner misclassified are given higher weight, so the next learner concentrates on them. This continues until an acceptable level of accuracy is reached or a fixed number of learners has been trained.
- Gradient Boosting: Gradient Boosting trains weak learners sequentially, fitting each new learner to the residual errors (the negative gradient of the loss function) of the current ensemble, so that every addition reduces the overall loss.
- XGBoost: XGBoost (Extreme Gradient Boosting) is a scalable, regularized implementation of gradient-boosted decision trees. Each new tree is fit to the instances the current ensemble predicts poorly, and its output is added to the model, yielding a progressively stronger predictor. A brief usage sketch of all three algorithms follows this list.
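The sketch below fits library implementations of all three families on a toy dataset. AdaBoostClassifier and GradientBoostingClassifier ship with scikit-learn; XGBClassifier assumes the separate xgboost package is installed. The dataset and hyperparameter values are illustrative.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier  # requires: pip install xgboost

X, y = make_classification(n_samples=1000, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

models = {
    "AdaBoost": AdaBoostClassifier(n_estimators=100, random_state=0),
    "Gradient Boosting": GradientBoostingClassifier(n_estimators=100, random_state=0),
    "XGBoost": XGBClassifier(n_estimators=100, random_state=0),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    print(f"{name}: test accuracy = {model.score(X_test, y_test):.3f}")
```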