Attention in Neural Nets - a podcast by Ben Jaffe and Katie Malone

Published 2019-06-17


There’s been a lot of interest lately in the attention mechanism in neural nets. It has a colloquial name (who isn’t familiar with the idea of “attention”?), but it’s really more of a technical trick, one that’s been pivotal to some recent advances in computer vision and especially word embeddings. It’s an interesting example of trying out human-cognitive-ish ideas (like focusing consideration more on some inputs than others) in neural nets, and one of the more high-profile recent successes in playing around with neural net architectures for fun and profit.
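To make the core idea concrete, here is a minimal sketch of scaled dot-product attention, the flavor of attention popularized by the Transformer line of work. It is not taken from the episode itself; the function names, shapes, and toy data below are illustrative assumptions, chosen only to show how a model can weight some inputs more heavily than others.

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the max for numerical stability before exponentiating
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(queries, keys, values):
    """Weight each value by how well its key matches each query.

    queries: (n_queries, d_k), keys: (n_inputs, d_k), values: (n_inputs, d_v)
    Returns the attended output (n_queries, d_v) and the attention weights:
    a weighted average of the values, with more weight on relevant inputs.
    """
    d_k = queries.shape[-1]
    scores = queries @ keys.T / np.sqrt(d_k)  # similarity of each query to each input
    weights = softmax(scores, axis=-1)        # each row sums to 1: a distribution over inputs
    return weights @ values, weights

# Toy example: one query attending over three inputs
rng = np.random.default_rng(0)
q = rng.normal(size=(1, 4))
k = rng.normal(size=(3, 4))
v = rng.normal(size=(3, 2))
output, attn = scaled_dot_product_attention(q, k, v)
print(attn)    # attention weights over the 3 inputs
print(output)  # the attention-weighted summary of the values
```

The printed weights show the “focusing consideration more on some inputs than others” idea directly: inputs whose keys align with the query get larger weights, and the output is dominated by their values.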
