The Fifth Elephant 2015
A conference on data, machine learning, and distributed and parallel computing
16–18 Jul 2015
Thu 16 Jul: 08:30 AM – 06:35 PM IST
Fri 17 Jul: 08:30 AM – 06:30 PM IST
Sat 18 Jul: 09:00 AM – 06:30 PM IST
Swaroop Krothapalli
To understand the most basic and convenient approaches to ensembling
Ensemble learning is all about combining predictions from different machine learning techniques in order to create a stronger overall prediction. For example, the predictions of a random forest, a support vector machine, and a simple linear model may be combined to create a stronger final prediction set. The key to creating a powerful ensemble is model diversity: an ensemble of models that are very similar in nature will perform worse than a more diverse set of models.
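As a minimal sketch of this idea (not code from the talk; the dataset and model choices are illustrative assumptions), the probabilities predicted by a random forest, an SVM, and a logistic regression can simply be averaged:

```python
# Soft-voting sketch: average the predicted class probabilities of three
# diverse base learners and take the class with the highest average.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

models = [
    RandomForestClassifier(n_estimators=200, random_state=0),
    SVC(probability=True, random_state=0),   # probability=True enables predict_proba
    LogisticRegression(max_iter=5000),
]

# Fit each base learner and collect its class probabilities on the test set.
probas = []
for model in models:
    model.fit(X_train, y_train)
    probas.append(model.predict_proba(X_test))
    print(type(model).__name__, accuracy_score(y_test, model.predict(X_test)))

# The ensemble prediction is the class with the highest average probability.
ensemble_pred = np.mean(probas, axis=0).argmax(axis=1)
print("Ensemble", accuracy_score(y_test, ensemble_pred))
```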
Ensemble learning can be broken down into two tasks: developing a population of base learners from the training data, and then combining them to form the composite predictor. In this talk I would like to give a basic overview of ensemble learning and discuss building an ensemble model by conducting a regularized, supervised search in a high-dimensional space of weak learners.
No wonder most of the winning submissions on Kaggle are ensemble models.
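One way to read the "regularized, supervised search in a high-dimensional space of weak learners" is as stacking with a sparse meta-learner. The sketch below is one possible interpretation under that assumption: it builds a pool of shallow trees on random feature subsets (purely illustrative choices) and lets an L1-regularized logistic regression select a sparse weighted combination of them.

```python
# Stacking sketch: each weak learner contributes one meta-feature (its
# out-of-fold probability for class 1); an L1-penalized meta-learner then
# performs a regularized, supervised search over this pool.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_predict, train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

rng = np.random.RandomState(0)
n_learners = 50
meta_train = np.zeros((X_train.shape[0], n_learners))
meta_test = np.zeros((X_test.shape[0], n_learners))

for j in range(n_learners):
    cols = rng.choice(X.shape[1], size=5, replace=False)   # random feature subset
    tree = DecisionTreeClassifier(max_depth=2, random_state=j)
    # Out-of-fold predictions keep the meta-learner honest (no target leakage).
    meta_train[:, j] = cross_val_predict(
        tree, X_train[:, cols], y_train, cv=5, method="predict_proba")[:, 1]
    tree.fit(X_train[:, cols], y_train)
    meta_test[:, j] = tree.predict_proba(X_test[:, cols])[:, 1]

# The L1 penalty drives most weak-learner weights to zero, keeping only a
# small, diverse subset of the pool in the final composite predictor.
meta = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
meta.fit(meta_train, y_train)
print("non-zero weights:", int(np.sum(meta.coef_ != 0)))
print("ensemble accuracy:", accuracy_score(y_test, meta.predict(meta_test)))
```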
Currently part of the data science team at Fidelity Investments, Business Analytics and Research.
Master's in Mathematics from BITS, Pilani.
An open-source enthusiast and Kaggler.
https://in.linkedin.com/in/kswaroop