Ensemble learning has two main types of techniques, known as bagging and boosting.
Bagging
- Build many predictors (models)
- Train each one on a random sub-sample (bootstrap) of the rows of the data
- Average the results or take the majority vote
- The models should be only loosely correlated with each other so that combining them reduces variance (a minimal sketch follows this list)
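To make the bagging steps concrete, here is a minimal from-scratch sketch, assuming numpy and scikit-learn are installed; the synthetic dataset, the 25 trees, and the bootstrap settings are illustrative choices, not taken from the references.

```python
# Minimal bagging sketch: bootstrap the rows, train a tree per bootstrap,
# then take the majority vote (all settings here are illustrative).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

rng = np.random.default_rng(0)
trees = []
for _ in range(25):
    # Bootstrap: sample rows with replacement, same size as the training set.
    idx = rng.integers(0, len(X_train), size=len(X_train))
    trees.append(DecisionTreeClassifier().fit(X_train[idx], y_train[idx]))

# Majority vote across the trees for each test row.
votes = np.stack([t.predict(X_test) for t in trees])  # shape (n_trees, n_test)
majority = np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)
print("bagging accuracy:", accuracy_score(y_test, majority))
```

Because every tree sees a different bootstrap, the trees disagree in different places, and the vote averages those disagreements away, which is where the variance reduction comes from.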
Boosting
- Predictors are built sequentially rather than independently (see the sketch after this list)
- Each predictor learns from the mistakes of the previous ones
- Fast, because each step fits only a simple (weak) learner
- Can lead to overfitting if run for too many rounds
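Below is a minimal gradient-boosting sketch for regression in the spirit of the "gradient boosting from scratch" reference; the learning rate, tree depth, and number of rounds are illustrative assumptions, and numpy and scikit-learn are assumed to be installed.

```python
# Minimal gradient-boosting sketch: start from the mean prediction and
# repeatedly fit a small tree to the current residuals (the "mistakes").
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor
from sklearn.metrics import mean_squared_error

X, y = make_regression(n_samples=1000, n_features=10, noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

learning_rate = 0.1
train_pred = np.full(len(X_train), y_train.mean())  # start from the mean
test_pred = np.full(len(X_test), y_train.mean())

for _ in range(100):
    # Each new tree is fit to the residuals left by the current ensemble.
    residuals = y_train - train_pred
    tree = DecisionTreeRegressor(max_depth=3).fit(X_train, residuals)
    train_pred += learning_rate * tree.predict(X_train)
    test_pred += learning_rate * tree.predict(X_test)

print("boosting test MSE:", mean_squared_error(y_test, test_pred))
```

Raising the number of rounds keeps lowering the training error, which is exactly how the overfitting mentioned above can creep in; the learning rate and tree depth are the usual knobs for controlling it.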
References
- https://medium.com/mlreview/gradient-boosting-from-scratch-1e317ae4587d
- https://www.dataquest.io/blog/introduction-to-ensembles/