Ensemble Learning

Bagging (bootstrap aggregating)
A simple way of ensembling by combining the results of multiple independently trained models. Each model is trained on a bootstrap sample: data points drawn from the training set with replacement, usually as many as the original set contains. Each model votes with equal weight: averaging for regression and majority vote for classification.
E.g. random forests
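
A minimal sketch with scikit-learn's BaggingClassifier; the synthetic dataset and parameter values here are illustrative assumptions, not prescribed settings:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Each of the 50 base models (decision trees by default) is fit on a
# bootstrap sample drawn with replacement from the training set.
bag = BaggingClassifier(n_estimators=50, bootstrap=True, random_state=0)
bag.fit(X_train, y_train)

# Prediction is a majority vote over the 50 trees.
print(bag.score(X_test, y_test))
```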

Boosting
Train models sequentially. Start with equally weighted data and fit the first model.
Then increase the weights on misclassified data, so the next model focuses on the examples its predecessors got wrong.
Repeat, and combine the models with a vote weighted by each model's accuracy. A sketch follows the example below.
E.g. AdaBoost
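
A minimal AdaBoost sketch with scikit-learn; dataset and n_estimators are illustrative assumptions:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Each round reweights the training data toward the examples the
# previous model misclassified; the final prediction is a vote
# weighted by each model's accuracy.
boost = AdaBoostClassifier(n_estimators=100, random_state=0)
boost.fit(X_train, y_train)
print(boost.score(X_test, y_test))
```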

Stacking
Train a meta-model that takes the predictions of multiple base models as its input features. To avoid overfitting, the meta-model is typically trained on out-of-fold predictions from the base models.
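
A minimal sketch with scikit-learn's StackingClassifier, which handles the out-of-fold predictions internally via cross-validation; the choice of base models and meta-model here is an illustrative assumption:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# The base models' predictions become the input features of the
# logistic-regression meta-model.
stack = StackingClassifier(
    estimators=[
        ("tree", DecisionTreeClassifier(random_state=0)),
        ("knn", KNeighborsClassifier()),
    ],
    final_estimator=LogisticRegression(),
)
stack.fit(X_train, y_train)
print(stack.score(X_test, y_test))
```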
