
Composable Distributed PT2(D)

Description

In this session, we will explore recent advances in PyTorch Distributed and dive into how multi-dimensional parallelism for training Large Language Models is made possible by composing PyTorch-native distributed training APIs.
