Description
We introduce the Lightning AI open source stack, a high-performance set of libraries for training, fine-tuning, and deploying AI systems that augments the PyTorch ecosystem.
Today, PyTorch Lightning powers training workloads across the industry, from small-scale research to large-scale training runs. As of June 2024, the package had reached 130M total downloads, a 2x increase since early 2023. PyTorch Lightning 2.4 adds support for 2D parallelism via DTensors, first introduced in PyTorch 2.3.
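As an illustration, the sketch below shows one way to enable 2D parallelism (data-parallel sharding combined with tensor parallelism) through the Trainer. It is a minimal sketch, assuming the ModelParallelStrategy API documented for Lightning 2.4; the device counts and strategy arguments are illustrative assumptions and should be checked against the release in use.

```python
# Minimal sketch of 2D parallelism (sharded data parallel + tensor parallel) with
# PyTorch Lightning 2.4. Strategy arguments are assumptions based on the 2.4 docs.
import lightning as L
from lightning.pytorch.strategies import ModelParallelStrategy

strategy = ModelParallelStrategy(
    data_parallel_size=2,    # replicate/shard the model across 2 groups of devices
    tensor_parallel_size=2,  # split individual layers across 2 devices via DTensors
)

trainer = L.Trainer(accelerator="cuda", devices=4, strategy=strategy)
# trainer.fit(model, datamodule)  # the LightningModule declares how its layers are
#                                 # parallelized in its configure_model() hook
```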
The open source stack is completed by Fabric (lightweight building blocks for scaling training workloads), LitGPT (a library for pre-training, fine-tuning, and serving LLMs), LitData (parallel data processing and streaming data loading), LitServe (a lightweight, high-performance serving framework), TorchMetrics (the de facto standard for deep learning metrics), and the recently released Thunder compiler. Together, these packages provide a low-friction, high-performance stack that democratizes and accelerates the AI lifecycle.
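To give a flavor of how Fabric scales a plain PyTorch loop, the sketch below uses a placeholder model and synthetic data; the Fabric calls (setup, setup_dataloaders, backward) reflect the public API, while the configuration values are illustrative assumptions.

```python
# A minimal, illustrative Fabric training loop; model and data are placeholders.
import torch
from torch.utils.data import DataLoader, TensorDataset
from lightning.fabric import Fabric

fabric = Fabric(accelerator="auto", devices=1)  # change devices/strategy to scale out
fabric.launch()

model = torch.nn.Linear(32, 2)                      # placeholder model
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
model, optimizer = fabric.setup(model, optimizer)   # move to device, apply the strategy

dataset = TensorDataset(torch.randn(64, 32), torch.randint(0, 2, (64,)))
dataloader = fabric.setup_dataloaders(DataLoader(dataset, batch_size=8))

for x, y in dataloader:
    optimizer.zero_grad()
    loss = torch.nn.functional.cross_entropy(model(x), y)
    fabric.backward(loss)  # replaces loss.backward(); handles precision/distributed details
    optimizer.step()
```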
The stack is optimized to run on Lightning Studios, a PyTorch-native, fully integrated AI development environment in the cloud.