Ensemble Learning

Bagging (bootstrap aggregating)
A simple way of ensembling: combine the results of multiple models trained in parallel. Each model is trained on a bootstrap sample, i.e. a random sample of the training data drawn with replacement. Each model votes with equal weight: predictions are averaged for regression and decided by majority vote for classification.
E.g. random forests, which additionally randomize the features considered at each split.
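
As a concrete sketch (using scikit-learn, my choice here; the post names no library), BaggingClassifier trains each base tree on its own bootstrap sample and combines their votes:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, random_state=0)  # toy dataset

bagging = BaggingClassifier(
    DecisionTreeClassifier(),  # base model (also the default)
    n_estimators=50,           # number of models in the ensemble
    bootstrap=True,            # sample the training data with replacement
    random_state=0,
)
print(cross_val_score(bagging, X, y).mean())  # accuracy of the majority vote
```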

Boosting
Train models sequentially rather than in parallel. Start with equally weighted data.
Increase the weights of misclassified examples so the next model focuses on them.
Repeat; the final prediction is a weighted vote over all the models.
E.g. AdaBoost
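
A minimal AdaBoost sketch, again assuming scikit-learn; each round reweights the examples the previous models got wrong, and the default base model is a depth-1 decision tree (a stump):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=500, random_state=0)  # toy dataset

boost = AdaBoostClassifier(
    n_estimators=100,  # sequential rounds; each new stump sees reweighted data
    random_state=0,
)
print(cross_val_score(boost, X, y).mean())  # accuracy of the weighted vote
```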

Stacking
Train a meta-model that takes the predictions of multiple base models as input and learns how to combine them (typically fit on out-of-fold predictions to avoid leakage).
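
A minimal stacking sketch, assuming scikit-learn's StackingClassifier; a logistic-regression meta-model is trained on the base models' cross-validated predictions:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, random_state=0)  # toy dataset

stack = StackingClassifier(
    estimators=[("tree", DecisionTreeClassifier()), ("svm", SVC())],
    final_estimator=LogisticRegression(),  # learns to combine the base outputs
)
print(cross_val_score(stack, X, y).mean())
```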
