Join us for an interview with star PyTorch community members Michael Galarnyk and Richard Liaw as we learn how the open source framework Ray can make building distributed PyTorch applications easy!
Ray is a popular framework for distributed Python that can be paired with PyTorch to rapidly scale machine learning applications. Ray has a large ecosystem of applications and libraries that leverage and integrate with PyTorch. This includes Ray Tune, a Python library for experiment execution and hyperparameter tuning at any scale; RLlib, a state-of-the-art library for reinforcement learning; and Ray Serve, a library for scalable model serving. Together, Ray and PyTorch are becoming the core foundation for the next generation of production machine learning platforms.
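To give a flavor of the Ray Tune integration mentioned above, here is a minimal sketch (not from the video) of wrapping a PyTorch training loop in a Tune hyperparameter search. The model, search space, and step count are illustrative assumptions, and the API shown is the classic `tune.run`/`tune.report` style from the Ray 1.x era of this talk:

```python
# Minimal sketch: Ray Tune driving a toy PyTorch training loop.
# Everything here (model, data, search space) is a placeholder assumption.
import torch
import torch.nn as nn
from ray import tune

def train_toy(config):
    # Tiny stand-in model; a real run would load data and train a full network.
    model = nn.Linear(784, 10)
    optimizer = torch.optim.SGD(model.parameters(), lr=config["lr"])
    for step in range(10):
        x = torch.randn(32, 784)
        y = torch.randint(0, 10, (32,))
        loss = nn.functional.cross_entropy(model(x), y)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        tune.report(loss=loss.item())  # send the metric back to Tune

# Launch one trial per learning rate; Ray runs them in parallel across workers.
analysis = tune.run(
    train_toy,
    config={"lr": tune.grid_search([0.01, 0.1, 1.0])},
)
print(analysis.get_best_config(metric="loss", mode="min"))
```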
0:00 Starting soon
1:53 Livestream start/Intros
5:03 Distributed PyTorch with Ray Presentation
15:25 Q&As
------ Project Links ------
Poster: https://assets.pytorch.org/pted2021/p...
Website: https://ray.io/
Github: https://github.com/ray-project/ray
Getting Started with Distributed Machine Learning with PyTorch and Ray Blog: / getting-started-with-distributed-machine-l...
Ray White Paper: https://docs.ray.io/en/master/whitepa...
PyCon Patterns of ML in production: • TALK / Simon Mo / Patterns of ML Mode...
Ray + Airflow: https://www.astronomer.io/blog/airflo...
------ Social Accounts ------
Michael Galarnyk: / galarnykmichael
Richard Liaw: / richliaw
Suraj Subramanian: / subramen
Jessica Lin: / hey_its_jlin
PyTorch: / pytorch
Facebook Open Source Twitter: / fbopensource