Talk #0: Introduction + Mixtral Mixture of Experts (MoE) model + SLURM Overview
by Chris Fregly (Principal Solution Architect, Generative AI) and Antje Barth (Principal Developer Advocate, Generative AI)
Talk #1: Train the Mixtral MoE foundation model on SLURM with SageMaker HyperPod
by Ben Snyder (Applied Scientist, AWS)
Talk #2: Instruction Fine-Tuning and Continued Pre-training
by Antje Barth and Chris Fregly
Webinar RSVP: https://www.eventbrite.com/e/webinar-...
Zoom link: https://us02web.zoom.us/j/82308186562
Related Links
O'Reilly Book: https://www.amazon.com/Generative-AWS...
Website: https://generativeaionaws.com
Meetup: https://meetup.generativeaionaws.com
GitHub Repo: https://github.com/generative-ai-on-aws/
YouTube: https://youtube.generativeaionaws.com