56 - Deep contextualized word representations, with Matthew Peters

Published 2018-04-04


NAACL 2018 paper by Matt Peters, Mark Neumann, Mohit Iyyer, Matt Gardner, Chris Clark, Kenton Lee, and Luke Zettlemoyer.

In this episode, AI2's own Matt Peters comes on the show to talk about his recent work on ELMo embeddings, what some have called "the next word2vec". Matt has shown very convincingly that using a pre-trained bidirectional language model to get contextualized word representations performs substantially better than using static word vectors. He gives us some more intuition about how and why this works, and talks about some of the other things he tried and what's coming next.
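As a concrete illustration of the contextualized-vs-static distinction discussed in the episode, here is a minimal sketch of pulling ELMo embeddings via AI2's AllenNLP library. This is not from the episode itself: the model URLs and the Elmo/batch_to_ids API follow the allennlp 0.x releases and may have changed since.

```python
# A minimal sketch (assumptions noted above) of getting ELMo contextualized
# embeddings with AllenNLP.
from allennlp.modules.elmo import Elmo, batch_to_ids

# Published "small" ELMo model files (assumed paths; may have moved).
options_file = ("https://allennlp.s3.amazonaws.com/models/elmo/"
                "2x1024_128_2048cnn_1xhighway/"
                "elmo_2x1024_128_2048cnn_1xhighway_options.json")
weight_file = ("https://allennlp.s3.amazonaws.com/models/elmo/"
               "2x1024_128_2048cnn_1xhighway/"
               "elmo_2x1024_128_2048cnn_1xhighway_weights.hdf5")

# num_output_representations=1 -> one learned scalar mix of the biLM layers.
elmo = Elmo(options_file, weight_file, num_output_representations=1, dropout=0)

# The same surface form ("bank") gets different vectors in different
# contexts, unlike a static word2vec/GloVe lookup.
sentences = [["The", "bank", "raised", "interest", "rates", "."],
             ["We", "sat", "on", "the", "river", "bank", "."]]
character_ids = batch_to_ids(sentences)          # (batch, max_len, 50) char ids
output = elmo(character_ids)
embeddings = output["elmo_representations"][0]   # (batch, max_len, dim)
```

Each token vector here is a function of the entire sentence, which is exactly the property the paper shows helps across downstream tasks.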

https://www.semanticscholar.org/paper/Deep-contextualized-word-representations-Peters-Neumann/4b17597b856c087f109381ce77d60d9017cb6f9a
