Ensemble Learning

An ensemble model combines multiple ‘individual’ (diverse) models and typically delivers better predictive performance than any single one of its members.

Bagging

Bagging (bootstrap aggregating) is an approach where you draw random samples of the data with replacement, train a learning algorithm on each sample, and then combine the resulting models by averaging their predictions (or taking a majority vote for classification).
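A minimal sketch of this idea, assuming scikit-learn (the text does not name a library) with decision trees as the individual models:

```python
# Bagging sketch: bootstrap samples + decision trees, combined by voting.
# scikit-learn and the synthetic dataset are illustrative assumptions.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Each tree is trained on a bootstrap sample of the training data;
# the ensemble aggregates their predictions.
bagging = BaggingClassifier(DecisionTreeClassifier(), n_estimators=50,
                            random_state=0)
bagging.fit(X_train, y_train)
print(round(bagging.score(X_test, y_test), 2))
```

Random forests follow the same recipe, additionally randomizing the features considered at each split.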

Boosting

The term ‘Boosting’ refers to a family of algorithms that convert weak learners into a strong learner by training them sequentially, with each new learner focusing on the examples its predecessors misclassified.
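A hedged example using AdaBoost, one well-known member of this family (scikit-learn and the dataset are again illustrative choices):

```python
# Boosting sketch with AdaBoost: shallow trees trained in sequence,
# each reweighting the examples the previous ones got wrong.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Default weak learner is a decision stump (depth-1 tree).
boost = AdaBoostClassifier(n_estimators=100, random_state=0)
boost.fit(X_train, y_train)
print(round(boost.score(X_test, y_test), 2))
```

Gradient boosting (e.g. GradientBoostingClassifier, XGBoost, LightGBM) follows the same sequential idea but fits each new learner to the residual errors instead of reweighting examples.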

Stacking

First, multiple base classifiers each predict the class. Second, a new learner (the meta-learner) is trained to combine their predictions, with the aim of reducing the generalization error.
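The two steps can be sketched as follows, assuming scikit-learn with a decision tree and k-nearest neighbors as base classifiers and logistic regression as the meta-learner (all illustrative choices):

```python
# Stacking sketch: heterogeneous base classifiers, combined by a
# logistic-regression meta-learner trained on their predictions.
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

base = [('tree', DecisionTreeClassifier(random_state=0)),
        ('knn', KNeighborsClassifier())]
# The final_estimator learns how much to trust each base classifier.
stack = StackingClassifier(estimators=base,
                           final_estimator=LogisticRegression())
stack.fit(X_train, y_train)
print(round(stack.score(X_test, y_test), 2))
```

Diversity among the base classifiers matters: a meta-learner gains little from combining models that all make the same mistakes.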

A good model should maintain a balance between bias and variance; this is known as the bias-variance trade-off. Ensemble learning is one way to manage this trade-off: bagging mainly reduces variance, while boosting mainly reduces bias.
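One way to see the variance-reduction side of the trade-off is to compare cross-validated scores of a single deep tree (low bias, high variance) against a bagged ensemble of the same trees. This is a sketch under the same scikit-learn and synthetic-data assumptions as above:

```python
# Compare a high-variance single tree with a bagged ensemble of trees.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, random_state=0)

tree = DecisionTreeClassifier(random_state=0)   # low bias, high variance
bagged = BaggingClassifier(DecisionTreeClassifier(random_state=0),
                           n_estimators=50, random_state=0)

for name, model in [('single tree', tree), ('bagged trees', bagged)]:
    scores = cross_val_score(model, X, y, cv=5)
    # Mean = accuracy level; std across folds hints at variance.
    print(name, round(scores.mean(), 2), round(scores.std(), 3))
```

On most datasets the bagged ensemble matches or beats the single tree's mean accuracy while averaging away much of its fold-to-fold variability.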