The Fifth Elephant 2015

A conference on data, machine learning, and distributed and parallel computing

Swaroop Krothapalli


Ensemble Learning

Submitted Jun 14, 2015

To understand the most basic and convenient approaches to ensembling


Ensemble learning is all about combining predictions from different machine learning techniques to create a stronger overall prediction. For example, the predictions of a random forest, a support vector machine, and a simple linear model may be combined into a stronger final prediction set. The key to creating a powerful ensemble is model diversity: an ensemble of very similar models will perform worse than a more diverse model set.
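The idea above can be sketched with a soft-voting ensemble over three structurally different models; the dataset, model choices, and hyperparameters here are illustrative assumptions, not part of the talk.

```python
# A minimal sketch of combining diverse models by soft voting
# (averaging predicted class probabilities). All model choices
# and parameters below are illustrative assumptions.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Three structurally different base learners encourage diversity.
ensemble = VotingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
        ("svm", SVC(probability=True, random_state=0)),
        ("lr", LogisticRegression(max_iter=1000)),
    ],
    voting="soft",  # average class probabilities across models
)
ensemble.fit(X_train, y_train)
accuracy = ensemble.score(X_test, y_test)
```

Hard voting (majority vote on predicted labels) is the other common choice; soft voting tends to work better when the base models produce calibrated probabilities.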

Ensemble learning can be broken down into two tasks: developing a population of base learners from the training data, and then combining them to form the composite predictor. In this talk I will give a basic overview of ensemble learning and discuss building an ensemble model by conducting a regularized, supervised search in a high-dimensional space of weak learners.
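One way to read "a regularized, supervised search in a high-dimensional space of weak learners" is: treat each weak learner's predictions as one coordinate of a high-dimensional feature space, then fit an L1-regularized combiner that selects a sparse set of them. The sketch below is one possible interpretation under stated assumptions (shallow bootstrapped trees as weak learners, Lasso as the combiner), not necessarily the speaker's exact method.

```python
# Sketch: build a population of weak learners, then run a regularized,
# supervised search over them by fitting a Lasso on their predictions.
# Weak-learner and combiner choices are illustrative assumptions.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=400, n_features=10, noise=5.0, random_state=0)
X_train, X_hold, y_train, y_hold = train_test_split(X, y, random_state=0)

# Task 1: develop a population of base learners from the training data
# (here, shallow trees fit on bootstrap resamples).
rng = np.random.default_rng(0)
learners = []
for _ in range(200):
    idx = rng.integers(0, len(X_train), len(X_train))
    tree = DecisionTreeRegressor(max_depth=3, random_state=0)
    tree.fit(X_train[idx], y_train[idx])
    learners.append(tree)

# Task 2: combine them. Each learner's holdout predictions become one
# column of a high-dimensional feature matrix.
P = np.column_stack([t.predict(X_hold) for t in learners])

# The L1 penalty pushes most weights to zero, so the search keeps only
# a sparse, useful subset of the 200 weak learners.
combiner = Lasso(alpha=1.0, max_iter=10000)
combiner.fit(P, y_hold)
n_selected = int(np.sum(combiner.coef_ != 0))
```

In practice the combiner is usually fit on out-of-fold predictions rather than a single holdout set, to avoid overfitting the weights.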

No wonder most of the winning submissions on Kaggle are ensemble models.

Speaker bio

Currently part of the data science team at Fidelity Investments, Business Analytics and Research.

Master’s in Mathematics from BITS, Pilani.
An open-source enthusiast and Kaggler.


