
Building Data Workflows with Luigi and Kubernetes

Description

This talk will focus on how to build complex data pipelines in Python. I will introduce Luigi and show how it solves the problems that come up when running chains of batch jobs: dependency resolution, workflow management, visualisation, failure handling, and so on.
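To give a feel for this, here is a minimal sketch of a two-task Luigi pipeline (the task names and file paths are illustrative, not taken from the talk). Luigi works out the execution order from requires() and skips any task whose output() target already exists:

    import luigi

    class ExtractData(luigi.Task):
        """Hypothetical first stage: write a raw CSV file."""
        date = luigi.DateParameter()

        def output(self):
            return luigi.LocalTarget(f"data/raw/{self.date}.csv")

        def run(self):
            with self.output().open("w") as f:
                f.write("id,value\n1,42\n")

    class TransformData(luigi.Task):
        """Hypothetical second stage: depends on ExtractData."""
        date = luigi.DateParameter()

        def requires(self):
            return ExtractData(date=self.date)

        def output(self):
            return luigi.LocalTarget(f"data/processed/{self.date}.csv")

        def run(self):
            # Read the upstream output and write a transformed copy.
            with self.input().open() as src, self.output().open("w") as dst:
                for line in src:
                    dst.write(line.upper())

    if __name__ == "__main__":
        luigi.run()

Running something like "python pipeline.py TransformData --date 2018-01-01 --local-scheduler" builds the whole chain; pointing the same command at a running luigid scheduler instead gives the dependency-graph visualisation mentioned above.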

After that, I will show how to package Luigi pipelines as Docker images for easier testing and deployment. Finally, I will walk through deploying them on a Kubernetes cluster, which makes it possible to scale Big Data pipelines on demand and reduce infrastructure costs. I will also share tips and tricks for making the Luigi scheduler play well with Kubernetes' batch execution features.
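As one possible bridge between the two, Luigi ships a contrib module for Kubernetes. The sketch below is adapted in spirit from the luigi.contrib.kubernetes documentation and runs a container as a Kubernetes Job from inside a Luigi workflow; it assumes pykube is installed and a kubeconfig is available, and the image name and command are placeholders rather than anything from the talk's demo project:

    from luigi.contrib.kubernetes import KubernetesJobTask

    class PipelineStageOnK8s(KubernetesJobTask):
        """Hypothetical task that delegates its work to a Kubernetes Job."""
        name = "pipeline-stage"      # used as the Kubernetes Job name prefix
        max_retrials = 3             # let Kubernetes retry failed pods
        spec_schema = {
            "containers": [{
                "name": "pipeline-stage",
                # Placeholder image containing the packaged Luigi pipeline
                "image": "myregistry/my-luigi-pipeline:latest",
                "command": ["python", "-m", "pipeline", "TransformData"],
            }]
        }

The exact attributes depend on your Luigi version, but the idea is the same as in the talk: the pipeline code is baked into a Docker image, and each stage runs as a short-lived Kubernetes Job that the cluster can schedule and scale on demand.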

This talk will be accompanied by a demo project. It will be most useful for an audience with some experience running batch jobs (not necessarily in Python), typically people working in the Big Data sphere such as data scientists, data engineers, BI developers and software developers. Familiarity with Python is helpful but not required.
