
Understanding and Applying Self-Attention for NLP

Description

Understanding attention mechanisms and self-attention, introduced in Google's "Attention Is All You Need" paper, is valuable for anyone working on complex NLP problems. In this talk, we will go over the main parts of the Google Transformer's self-attention model and the intuition behind it. Then we will look at how this architecture can be applied to other NLP tasks, e.g. slot filling.
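
For readers who want to see the core computation before the talk, here is a minimal NumPy sketch of single-head scaled dot-product self-attention, the building block of the Transformer. Names such as `self_attention`, `w_q`, `w_k`, and `w_v` are illustrative, not code from the talk.

```python
# Minimal sketch of scaled dot-product self-attention
# ("Attention Is All You Need", Vaswani et al., 2017).
# Shapes and variable names are illustrative.
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max for numerical stability.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x, w_q, w_k, w_v):
    """Single-head self-attention over one sequence.

    x:             (seq_len, d_model) input embeddings
    w_q, w_k, w_v: (d_model, d_k) learned projection matrices
    """
    q = x @ w_q  # queries
    k = x @ w_k  # keys
    v = x @ w_v  # values
    d_k = q.shape[-1]
    # Every position attends to every position; scale by sqrt(d_k)
    # so the dot products do not saturate the softmax.
    scores = q @ k.T / np.sqrt(d_k)
    weights = softmax(scores, axis=-1)  # (seq_len, seq_len)
    return weights @ v                  # (seq_len, d_k)

# Toy usage: 5 tokens with 8-dimensional embeddings, d_k = 4.
rng = np.random.default_rng(0)
x = rng.normal(size=(5, 8))
w_q, w_k, w_v = (rng.normal(size=(8, 4)) for _ in range(3))
out = self_attention(x, w_q, w_k, w_v)
print(out.shape)  # (5, 4)
```

In the full Transformer this computation is repeated across several heads in parallel (multi-head attention) and followed by a position-wise feed-forward layer; the sketch above covers only the single-head case.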

