Description
Enabling AI is energy-intensive. Many companies are now turning their attention toward productionizing machine learning models, yet energy efficiency is rarely treated as a main quality requirement of those models. In this talk, we outline energy efficiency guidelines that you can start using today!
Deploying machine learning models into production can be very energy-costly. Training and hyper-parameter tuning, for instance, consume substantial hardware resources over long periods. Yet most companies do not consider energy efficiency metrics among the main quality attributes of their machine learning pipelines.
In this talk, we walk you through example machine learning pipelines, which we use to demonstrate a number of energy efficiency approaches. We locate inefficiency hot spots across the model life cycle by examining the utilization rates of cloud resources. Finally, we evaluate the effectiveness of our energy efficiency approaches against other efficiency metrics, such as performance.
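To make the idea of spotting inefficiency through utilization rates concrete, here is a minimal sketch of estimating a training job's energy cost from sampled utilization. The TDP value, sampling interval, and utilization trace are illustrative assumptions, not figures from the talk.

```python
# Hypothetical sketch: approximate the energy of a cloud training job from
# periodic utilization samples. Assumed values, for illustration only.

GPU_TDP_WATTS = 300.0      # assumed thermal design power of one GPU
SAMPLE_INTERVAL_S = 60.0   # assumed sampling interval for utilization

def estimate_energy_kwh(utilization_samples, tdp_watts, interval_s):
    """Approximate energy as utilization-scaled power integrated over time."""
    joules = sum(u * tdp_watts * interval_s for u in utilization_samples)
    return joules / 3.6e6  # convert joules to kWh

# Example: a GPU sitting at ~20% utilization for an hour (60 one-minute
# samples) leaves most of its power budget idle -- a typical hot spot.
samples = [0.2] * 60
print(estimate_energy_kwh(samples, GPU_TDP_WATTS, SAMPLE_INTERVAL_S))
```

A low average utilization over a long window, as in the example, is exactly the kind of signal the talk uses to flag wasteful stages of the pipeline.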
Data scientists, machine learning engineers, and data engineers can benefit from this talk by reusing the guidelines in similar use cases. Above all, the talk emphasizes the importance of "energy efficiency by design" in the data science field.