Description
NLP applications and LLMs have incredible potential to transform the way we live, work, and play. But NLP applications, especially at production scale, can quickly become confusing to design and maintain. In this talk, we’ll explore how thinking of NLP applications as graphs can reduce that confusion and help you build and customize quickly.
We will first look into what makes up some of the most common NLP applications today, such as retrieval-augmented generation (RAG), and how each step of these applications can be represented as a node in a graph. Then, we will see how we can incorporate branches and loops into these applications.
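To make the graph framing concrete, here is a minimal RAG sketch, assuming Haystack 2.x's Pipeline API (the component names, template, and document content are illustrative, not the talk's actual code): each step is a node, and connect() adds the edges between them.

```python
# Minimal sketch of a RAG pipeline as a graph (assumes Haystack 2.x and an
# OPENAI_API_KEY in the environment). Each add_component() call adds a node;
# each connect() call adds an edge.
from haystack import Document, Pipeline
from haystack.components.builders import PromptBuilder
from haystack.components.generators import OpenAIGenerator
from haystack.components.retrievers.in_memory import InMemoryBM25Retriever
from haystack.document_stores.in_memory import InMemoryDocumentStore

# A small document store to retrieve from (illustrative content).
store = InMemoryDocumentStore()
store.write_documents([Document(content="Haystack pipelines are directed graphs of components.")])

template = """Answer the question using the context below.
Context:
{% for doc in documents %}{{ doc.content }}{% endfor %}
Question: {{ question }}"""

pipe = Pipeline()
pipe.add_component("retriever", InMemoryBM25Retriever(document_store=store))
pipe.add_component("prompt_builder", PromptBuilder(template=template))
pipe.add_component("llm", OpenAIGenerator())

# Edges: retrieved documents feed the prompt, the prompt feeds the LLM.
pipe.connect("retriever.documents", "prompt_builder.documents")
pipe.connect("prompt_builder.prompt", "llm.prompt")

question = "What is a Haystack pipeline?"
result = pipe.run({"retriever": {"query": question}, "prompt_builder": {"question": question}})
print(result["llm"]["replies"][0])
```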
As a final step, we’ll see how we can build customized tooling for NLP applications in Python.
We will use the pipeline structure of Haystack (an open-source LLM framework in Python) as the basis for our examples, covering two working applications that use Haystack’s custom component API (the nodes of the graph) within a full pipeline (a minimal component sketch follows the list):
- A private Notion question-answering app
- A summarizer for the latest Hacker News posts
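Below is a sketch of what one such custom component might look like for the Hacker News example, assuming Haystack 2.x's @component decorator; the class name, fields, and logic are illustrative rather than the talk's actual implementation.

```python
# A hypothetical custom component: a graph node that fetches the latest
# Hacker News post titles, ready to be slotted into a summarization pipeline.
import requests
from haystack import component


@component
class HackerNewsFetcher:
    @component.output_types(titles=list[str])
    def run(self, top_k: int = 5):
        # Hacker News public API: fetch the IDs of the current top stories,
        # then look up each item's title.
        ids = requests.get("https://hacker-news.firebaseio.com/v0/topstories.json").json()[:top_k]
        titles = [
            requests.get(f"https://hacker-news.firebaseio.com/v0/item/{i}.json").json().get("title", "")
            for i in ids
        ]
        return {"titles": titles}


# Components can also be run standalone while you develop them:
fetcher = HackerNewsFetcher()
print(fetcher.run(top_k=3)["titles"])
```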
Key takeaways:
- It’s useful to think of NLP applications as directed (multi-)graphs
- Python developers can build their own tooling that slots into this architecture using open-source frameworks like Haystack.