104 - Model Distillation, with Victor Sanh and Thomas Wolf - a podcast by Allen Institute for Artificial Intelligence

from 2020-02-03T20:47:03


In this episode we talked with Victor Sanh and Thomas Wolf from HuggingFace about model distillation, with DistilBERT as one example. The idea behind model distillation is to compress a large model by building a smaller model, with far fewer parameters, that approximates the output distribution of the original model, typically for increased efficiency. We discussed how model distillation had typically been done before, and then focused on the specifics of DistilBERT, including its training objective, empirical results, and ablations. Finally, we discussed what kinds of information you might lose when doing model distillation.
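To make the "approximating the output distribution" idea concrete, here is a minimal sketch of a typical soft-target distillation loss. It is not taken from the episode or from the DistilBERT code; the names (student_logits, teacher_logits, temperature) and the choice of PyTorch are illustrative assumptions.

```python
# Sketch of a soft-target distillation loss: the student is trained to match
# the teacher's temperature-softened output distribution. Illustrative only.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      temperature: float = 2.0) -> torch.Tensor:
    """KL divergence between softened teacher and student distributions."""
    # Softening with a temperature > 1 exposes the teacher's relative
    # probabilities over non-top classes, not just its argmax prediction.
    student_log_probs = F.log_softmax(student_logits / temperature, dim=-1)
    teacher_probs = F.softmax(teacher_logits / temperature, dim=-1)
    # Scale by T^2 so gradient magnitudes stay comparable across temperatures.
    return F.kl_div(student_log_probs, teacher_probs,
                    reduction="batchmean") * temperature ** 2

# Toy usage: a batch of 4 examples over a 10-way output distribution.
student_logits = torch.randn(4, 10, requires_grad=True)
teacher_logits = torch.randn(4, 10)
loss = distillation_loss(student_logits, teacher_logits)
loss.backward()
```

In practice this soft-target term is usually combined with the task loss on the hard labels (for DistilBERT, the masked language modeling objective), but the exact mix of losses is discussed in the episode and the paper rather than shown here.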
