KL Divergence - an episode of Linear Digressions, a podcast by Ben Jaffe and Katie Malone

Published: 2017-08-07


Kullback-Leibler divergence, or KL divergence, is a measure of the information lost when you approximate one probability distribution with another. It comes to us originally from information theory, but today it underpins more machine-learning-focused algorithms like t-SNE. And boy oh boy can it be tough to explain. But we're trying our hardest in this episode!
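For reference, the standard discrete form of the quantity discussed here (the episode may present it differently) is:

D_{\mathrm{KL}}(P \,\|\, Q) = \sum_x P(x) \log \frac{P(x)}{Q(x)}

It is zero exactly when P and Q agree everywhere, and it grows as Q becomes a worse approximation of P. Note that it is not symmetric: approximating P with Q generally costs a different amount of information than approximating Q with P.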
