Description
Filmed at PyData London 2017
Description Chatbots are all the rage right now, like it or not. In this talk I want to take a look under the hood and show you how simple it is today to incorporate sophisticated language understanding in your applications. The tool of choice is distributed representations, also known as word vectors, which allow us to answer the most crucial question: which of these words mean similar things?
Abstract It is hard to go anywhere on the web these days without encountering chatbots and other natural language interfaces. But how do these bots actually understand what you say? It turns out it can be boiled down to a simple recipe: you need to know which words mean similar things! This sounds straightforward, but efficient ways of doing this, namely distributed representations, were only discovered in the past few years of machine learning research. They have immense potential, and we are only beginning to realise what we can do with them.
In this talk I want to outline how distributed representations are used at Babylon Health, and how everyone can incorporate sophisticated language understanding into their applications with just a few lines of Python. Furthermore, I will give a glimpse of the research we are doing, some of which we recently published in a paper at ICLR.
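To illustrate the "few lines of Python" the abstract refers to, here is a minimal sketch (not the speaker's actual code) of asking "which words mean similar things?" with pretrained word vectors via gensim; the model name "glove-wiki-gigaword-50" is an assumed example of an available pretrained set.

```python
import gensim.downloader as api

# Load a small set of pretrained word vectors (assumed example model).
model = api.load("glove-wiki-gigaword-50")

# Which words are closest to "doctor" in the vector space?
print(model.most_similar("doctor", topn=5))

# How similar are two specific words? Returns a cosine similarity score.
print(model.similarity("doctor", "physician"))
```

The point of the sketch is simply that, with distributed representations, word similarity becomes a nearest-neighbour lookup in a vector space rather than a hand-written rule.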