Ensemble Algorithms - a podcast by Ben Jaffe and Katie Malone

from 2017-01-23T02:31:26


If one machine learning model is good, are two models better? In a lot of cases, the answer is yes. If you build many okay models, then bring them all together and use them in combination to make your final predictions, you've just created an ensemble model. It feels a little bit like cheating, like you just got something for nothing, but the results don't lie: algorithms like Random Forests and Gradient Boosted Trees (two types of ensemble algorithms) are some of the strongest out-of-the-box algorithms for classic supervised classification problems. What makes a Random Forest random, and what does it mean to gradient boost a tree? Have a listen and find out.
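
As a rough illustration (not from the episode itself), here is a minimal scikit-learn sketch comparing a single decision tree against the two ensemble methods named above. The synthetic dataset and the hyperparameters are arbitrary assumptions chosen only to show the idea of combining many trees into one predictor.

```python
# Minimal sketch: one "okay" tree vs. two tree ensembles on a toy problem.
# Dataset size and n_estimators are assumptions, not values from the episode.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

models = {
    "single decision tree": DecisionTreeClassifier(random_state=0),
    "random forest (bagged trees)": RandomForestClassifier(n_estimators=200, random_state=0),
    "gradient boosted trees": GradientBoostingClassifier(n_estimators=200, random_state=0),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    print(f"{name}: test accuracy = {model.score(X_test, y_test):.3f}")
```

On a run like this, the two ensembles will typically edge out the single tree, which is the "something for nothing" feeling the episode describes.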
