Less Talk, More Rock: Transfer Learning with Natural Language Processing

Description

Natural language processing models require enormous amounts of text data and computing time, resources that aren't available to most practitioners. But now you can use powerful pre-trained NLP models from Google, AllenNLP, and others who have done the heavy lifting for you, all accessible through straightforward Python libraries. Transfer learning lets you tailor these massive models to your specific domain.
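
To give a flavor of how little code this takes, here is a minimal sketch assuming the Hugging Face `transformers` library, which is one of several Python libraries that expose pre-trained models (the talk abstract does not name a specific one):

```python
# A minimal sketch, assuming the Hugging Face `transformers` library.
from transformers import pipeline

# Downloads a pre-trained sentiment model on first use; no training
# data or compute of your own is required.
classifier = pipeline("sentiment-analysis")

print(classifier("Transfer learning makes NLP accessible to everyone."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```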

2018 ushered in a new era for NLP based on a series of breakthroughs in the use of transfer learning. Traditionally, only word vectors such as word2vec, which provide rich representations of _words_, have been reused across different NLP tasks. However, word vectors have no notion of the context in which a word occurs, which is severely limiting. _Context-aware_ transfer learning is a powerful new technique that has already made a great impact.
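
To make the limitation concrete: word2vec assigns "bank" a single fixed vector, while a context-aware model produces a different vector for each occurrence. A hedged sketch, again assuming the `transformers` library and a BERT checkpoint (neither is specified above):

```python
# Contrasting context-aware embeddings with static word vectors,
# assuming the Hugging Face `transformers` library and PyTorch.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def vector_for(sentence: str, word: str) -> torch.Tensor:
    """Return the contextual embedding of `word` inside `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    idx = tokens.index(word)  # position of the word's token
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state  # (1, seq_len, 768)
    return hidden[0, idx]

river = vector_for("We sat on the bank of the river.", "bank")
money = vector_for("She deposited cash at the bank.", "bank")

# word2vec would give "bank" the same vector in both sentences;
# BERT's two vectors differ because the surrounding context differs.
sim = torch.cosine_similarity(river, money, dim=0)
print(f"cosine similarity between the two 'bank' vectors: {sim:.3f}")
```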

Within a year, transfer learning changed the state of the art in nearly every NLP task category. It is being used to generate extremely realistic text, and some work has even been kept secret for fear of its potential for misuse. It has driven improvements in sentiment analysis, document topic classification, and named entity extraction.

But because these techniques are so new, it is not yet well understood how they work, which types of problems benefit most, what hazards exist, or which tools to use. In this talk we'll cover the new techniques in transfer learning, survey their impact, and present a framework for applying them. We provide recommendations for choosing candidate problems, conducting experiments, understanding and interpreting these models, and picking from the available Python tooling.
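
As one illustration of what "tailoring a massive model to your domain" can look like in code, here is a hedged fine-tuning sketch; the talk does not prescribe a specific library, and the dataset below is a toy stand-in for a real domain corpus:

```python
# A hedged sketch of domain fine-tuning, assuming the Hugging Face
# `transformers` library and PyTorch. The two-example "dataset" is a
# hypothetical stand-in for real domain-specific text.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=2
)

texts = ["the turbine bearing is overheating", "routine inspection passed"]
labels = torch.tensor([1, 0])  # 1 = fault report, 0 = normal

inputs = tokenizer(texts, padding=True, return_tensors="pt")
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

model.train()
for _ in range(3):  # a few gradient steps; real training needs far more data
    outputs = model(**inputs, labels=labels)
    outputs.loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```

Only the small classification head is trained from scratch here; the pre-trained encoder's weights are merely nudged, which is why fine-tuning works even on modest domain datasets.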
