Episode 23: Why do ensemble methods work? - a podcast by Francesco Gadaleta

from 2017-10-03T13:05


Ensemble methods are designed to improve performance when a single model, on its own, is not very accurate. In its general form, ensembling consists of building a number of individual classifiers and then combining, or aggregating, their predictions into one classifier that is usually stronger than any of its members.
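As a quick illustration of this aggregation step (not taken from the episode itself), here is a minimal sketch of a majority-vote ensemble in scikit-learn; the synthetic dataset and the three base classifiers are arbitrary choices for the sake of the example:

```python
# Minimal sketch of a hard-voting ensemble (illustrative only).
# Assumes scikit-learn is installed; dataset and base models are arbitrary choices.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import VotingClassifier
from sklearn.metrics import accuracy_score

# Synthetic binary classification problem.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Three individually imperfect base classifiers.
base_models = [
    ("logreg", LogisticRegression(max_iter=1000)),
    ("tree", DecisionTreeClassifier(max_depth=5, random_state=0)),
    ("knn", KNeighborsClassifier(n_neighbors=7)),
]

# Aggregate their predictions by majority (hard) vote.
ensemble = VotingClassifier(estimators=base_models, voting="hard")

for name, model in base_models:
    model.fit(X_train, y_train)
    print(name, accuracy_score(y_test, model.predict(X_test)))

ensemble.fit(X_train, y_train)
print("ensemble", accuracy_score(y_test, ensemble.predict(X_test)))
```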


The key idea behind ensembling is that some models do well at capturing certain aspects of the data, while others do better at capturing other aspects.
In this episode I show, with a numeric example, why and when ensemble methods work.
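The episode's own numbers are not reproduced here, but a back-of-the-envelope calculation of the same flavour: if three classifiers each have accuracy p = 0.7 and make independent errors, a majority vote is correct whenever at least two of them are, which already gives about 0.784. The snippet below computes this (the independence assumption and the value p = 0.7 are illustrative assumptions, not figures from the episode):

```python
# Back-of-the-envelope illustration: majority voting over independent classifiers.
# Assumes errors are independent; p = 0.7 and the ensemble sizes are arbitrary choices.
from math import comb

def majority_vote_accuracy(p: float, n: int) -> float:
    """Probability that a majority of n independent classifiers,
    each correct with probability p, gets the right answer (n odd)."""
    k_min = n // 2 + 1  # smallest number of correct votes that wins the vote
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(k_min, n + 1))

print(majority_vote_accuracy(0.7, 3))   # ~0.784: already better than any single model
print(majority_vote_accuracy(0.7, 11))  # ~0.92: keeps improving as the ensemble grows
```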
