
Building quality data pipelines


Working with data can be daunting when making code changes. As we apply transformations to our data, how do we know that its quality is preserved? Performing these transformations on large datasets can be even more daunting, since the number of areas that could be affected grows, and troubleshooting each of them makes it time-consuming to identify the root cause of a problem.

Instead of going through a manual checklist of data points and features to check, this talk will walk through a framework for readily applying testing to ensure that any data transformations we make are accurate and that the quality of our data is preserved.

This talk is for anyone working with datasets where maintaining the quality of the data is important.
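As a flavour of the approach, a data-quality test replaces the manual checklist with assertions about properties the transformation must preserve. The sketch below is a minimal illustration, not the talk's actual framework; `clean_records` and `check_quality` are hypothetical names made up for this example.

```python
def clean_records(records):
    # Hypothetical transformation: normalize names and drop
    # records that are missing an age.
    return [
        {"name": r["name"].strip(), "age": r["age"]}
        for r in records
        if r.get("age") is not None
    ]

def check_quality(raw, transformed):
    # Property 1: the transformation never invents rows,
    # it only drops invalid ones.
    assert len(transformed) <= len(raw)
    # Property 2: no record in the output has a missing age.
    assert all(r["age"] is not None for r in transformed)
    # Property 3: names are normalized (no surrounding whitespace).
    assert all(r["name"] == r["name"].strip() for r in transformed)

raw = [
    {"name": " Ada ", "age": 36},
    {"name": "Grace", "age": None},
]
out = clean_records(raw)
check_quality(raw, out)
print(out)  # → [{'name': 'Ada', 'age': 36}]
```

Because the checks are written once as code, they run automatically on every change to the pipeline instead of being re-verified by hand.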

