Pre-train Mixtral MoE model on SageMaker HyperPod + SLURM + Fine-Tuning + Continued Pre-Training

Published: January 20, 2025
on the channel: Generative AI on AWS

Talk #0: Introduction + Mixtral Mixture of Experts (MoE) model + SLURM Overview
by Chris Fregly (Principal Solution Architect, Generative AI) and Antje Barth (Principal Developer Advocate, Generative AI)
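As background for the mixture-of-experts idea behind Mixtral, here is a minimal, hypothetical PyTorch sketch of top-2 expert routing. The class name, layer sizes, and routing details are illustrative only and are not taken from the actual Mixtral implementation.

```python
# Hypothetical minimal sketch of Mixtral-style top-2 expert routing (not the
# actual Mixtral code): a router scores each token, the top-2 experts process
# it, and their outputs are combined with the normalized router weights.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyMoELayer(nn.Module):
    def __init__(self, d_model=64, d_ff=256, num_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, num_experts, bias=False)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.SiLU(), nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        )

    def forward(self, x):                      # x: (tokens, d_model)
        logits = self.router(x)                # (tokens, num_experts)
        weights, idx = torch.topk(logits, self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)   # normalize over the chosen experts
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e          # tokens routed to expert e in slot k
                if mask.any():
                    out[mask] += weights[mask, k].unsqueeze(-1) * expert(x[mask])
        return out

x = torch.randn(10, 64)
print(TinyMoELayer()(x).shape)  # torch.Size([10, 64])
```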

Talk #1: Train the Mixtral MoE foundation model on SLURM with SageMaker HyperPod
by Ben Snyder, Applied Scientist @ AWS
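As a rough illustration of how a training script can plug into SLURM, the following hypothetical Python sketch reads standard SLURM environment variables (SLURM_PROCID, SLURM_NTASKS, SLURM_LOCALID) to initialize torch.distributed. The function name is made up, and a real HyperPod job would also need MASTER_ADDR and MASTER_PORT exported from its sbatch script; rendezvous details may differ from what a production setup uses.

```python
# Hypothetical sketch: derive rank / world size / local GPU from the SLURM
# environment (e.g. when launched with `srun` on a SageMaker HyperPod cluster).
import os
import torch
import torch.distributed as dist

def init_distributed_from_slurm():
    rank = int(os.environ["SLURM_PROCID"])        # global rank of this task
    world_size = int(os.environ["SLURM_NTASKS"])  # total number of tasks
    local_rank = int(os.environ["SLURM_LOCALID"]) # rank within this node
    # MASTER_ADDR / MASTER_PORT must be exported in the sbatch script,
    # e.g. from `scontrol show hostnames "$SLURM_JOB_NODELIST" | head -n 1`.
    dist.init_process_group(backend="nccl", rank=rank, world_size=world_size)
    torch.cuda.set_device(local_rank)
    return rank, world_size, local_rank

if __name__ == "__main__":
    rank, world_size, local_rank = init_distributed_from_slurm()
    print(f"rank {rank}/{world_size} on local GPU {local_rank}")
    dist.destroy_process_group()
```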

Talk #2: Instruction Fine-Tuning and Continued Pre-training
by Antje Barth and Chris Fregly
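To illustrate the difference between the two objectives covered in this talk, here is a hypothetical, toy-scale PyTorch sketch: instruction fine-tuning typically masks the prompt tokens out of the loss so only the response is learned, while continued pre-training keeps the plain causal-LM loss over every token. The token IDs, function names, and sizes below are all made up, and no real tokenizer or Mixtral weights are involved.

```python
# Hypothetical sketch contrasting instruction fine-tuning (prompt tokens masked
# from the loss) with continued pre-training (loss on every token).
import torch
import torch.nn.functional as F

IGNORE_INDEX = -100  # ignored by F.cross_entropy

def build_labels(prompt_ids, response_ids, instruction_tuning=True):
    input_ids = torch.cat([prompt_ids, response_ids])
    labels = input_ids.clone()
    if instruction_tuning:
        labels[: len(prompt_ids)] = IGNORE_INDEX  # no loss on the prompt
    return input_ids, labels

def causal_lm_loss(logits, labels):
    # Shift so each position predicts the next token.
    return F.cross_entropy(logits[:-1].float(), labels[1:], ignore_index=IGNORE_INDEX)

prompt = torch.tensor([11, 12, 13])        # toy "instruction" tokens
response = torch.tensor([21, 22, 23, 24])  # toy "response" tokens
vocab_size = 32

input_ids, labels = build_labels(prompt, response, instruction_tuning=True)
logits = torch.randn(len(input_ids), vocab_size)  # stand-in for model output
print("instruction fine-tuning loss:", causal_lm_loss(logits, labels).item())

input_ids, labels = build_labels(prompt, response, instruction_tuning=False)
print("continued pre-training loss:", causal_lm_loss(logits, labels).item())
```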

RSVP Webinar: https://www.eventbrite.com/e/webinar-...

Zoom link: https://us02web.zoom.us/j/82308186562

Related Links

O'Reilly Book: https://www.amazon.com/Generative-AWS...
Website: https://generativeaionaws.com
Meetup: https://meetup.generativeaionaws.com
GitHub Repo: https://github.com/generative-ai-on-aws/
YouTube: https://youtube.generativeaionaws.com