ScaleAI Driverless Data Wrangling + Hyper-personalized Recommendations + NLP Inference Optimizations

Published: 28 October 2024
on channel: Generative AI on AWS

RSVP Webinar: https://www.eventbrite.com/e/webinark...

Zoom link: https://us02web.zoom.us/j/82308186562

Data Science on AWS (O'Reilly Book): https://www.amazon.com/Data-Science-A...

Talk #0: Introductions and Meetup Announcements By Chris Fregly and Antje Barth

[06:25] Talk #1: Taming the Long Tail: Solving Edge Case Failures in Production ML Models By Russell Kaplan

Abstract: Most ML-powered products today depend on models that work well on common cases but struggle in rare ones. When model failures happen in production, teams need to root cause them efficiently, fix them fully, and ensure they don’t pop up again in the future. This talk will go over an emerging set of industry best practices for fixing problems within the long tail of production ML models, and making sure they stay fixed. We will also see a brief example of this workflow in practice using Scale Nucleus, a dataset management platform for ML engineers.
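
To make the workflow Russell describes more concrete, here is a minimal, hypothetical Python sketch of the first step: mining low-confidence or mislabeled production predictions into a curated "failure slice" for review and targeted retraining. It does not use the Scale Nucleus SDK; the names (PredictionLog, CONFIDENCE_THRESHOLD, the sample data) are illustrative assumptions only.

```python
# Hypothetical sketch of long-tail failure mining (not the Scale Nucleus SDK).
# Assumes production inference logs that record model confidence and, where
# available, a human-reviewed label.
from collections import Counter
from dataclasses import dataclass
from typing import List, Optional

CONFIDENCE_THRESHOLD = 0.6  # assumed cutoff for "the model was unsure"

@dataclass
class PredictionLog:
    item_id: str
    predicted_label: str
    confidence: float
    human_label: Optional[str] = None  # filled in after review/labeling

def mine_failure_slice(logs: List[PredictionLog]) -> List[PredictionLog]:
    """Collect likely edge cases: low-confidence or outright wrong predictions."""
    return [
        log for log in logs
        if log.confidence < CONFIDENCE_THRESHOLD
        or (log.human_label is not None and log.human_label != log.predicted_label)
    ]

def summarize_slice(failures: List[PredictionLog]) -> Counter:
    """Group failures by predicted label to see which classes dominate the long tail."""
    return Counter(log.predicted_label for log in failures)

if __name__ == "__main__":
    logs = [
        PredictionLog("img-001", "pedestrian", 0.42),
        PredictionLog("img-002", "cyclist", 0.91, human_label="cyclist"),
        PredictionLog("img-003", "car", 0.88, human_label="truck"),
    ]
    failures = mine_failure_slice(logs)
    print(f"{len(failures)} candidate edge cases:", summarize_slice(failures))
```

A dataset management platform like Nucleus then takes over from here: the mined slice is uploaded, visualized, re-labeled, and tracked so the same failure mode can be tested against future model versions.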

Speaker Bio: Russell Kaplan leads Scale Nucleus, the data management platform for machine learning teams. He was previously founder and CEO of Helia AI, a computer vision startup for real-time video understanding, which Scale acquired in 2020. Before that, Russell was a senior machine learning scientist on Tesla's Autopilot team, and he received his M.S. and B.S. from Stanford University, where he was a researcher in the Stanford Vision Lab advised by Fei-Fei Li.

[34:40] Talk #2: Building hyper-personalized recommendation systems with Amazon Personalize By James Jory

Abstract: Personalizing end-to-end user experiences typically involves multiple touch points where each is powered by distinct ML models trained with purpose-built algorithms. Learn how Amazon Personalize builds on more than 20 years of learnings delivering personalized experiences to Amazon customers. We’ll dive into the latest features of Amazon Personalize and demonstrate how the service can be put to work in a full-stack application architecture.
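
As one small example of the "put to work in a full-stack application" part, the sketch below requests recommendations from an already-deployed Amazon Personalize campaign via boto3. The campaign ARN, region, and user ID are placeholder assumptions, not values from the talk.

```python
import boto3

# Amazon Personalize runtime client (boto3 service name: "personalize-runtime").
personalize_runtime = boto3.client("personalize-runtime", region_name="us-east-1")

# Placeholder ARN; substitute the ARN of your own deployed campaign.
CAMPAIGN_ARN = "arn:aws:personalize:us-east-1:123456789012:campaign/my-demo-campaign"

response = personalize_runtime.get_recommendations(
    campaignArn=CAMPAIGN_ARN,
    userId="user-42",   # placeholder user from your interactions dataset
    numResults=10,
)

# Each entry carries an itemId and a relative score for ranking in the UI.
for item in response["itemList"]:
    print(item["itemId"], item.get("score"))
```

In a full-stack architecture this call typically sits behind an API layer (for example API Gateway and Lambda), with real-time user events streamed back to Personalize to keep recommendations fresh.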

Speaker Bio: James Jory is a Principal Solutions Architect in Applied AI with AWS. He has a special interest in personalization and recommender systems and a background in ecommerce, marketing technology, and customer data analytics. In his spare time, he enjoys camping and auto racing simulations.

[01:06:04] Talk #3: NLP inference optimization on Amazon SageMaker By Mia Chang

Abstract: What’s your experience with model serving and model inference? Do you know the potential ways to improve it when moving a model from training to production? There is a lot to consider, for instance SLA, model management, compute, data, cost, and security. In this session, Mia will share how to optimize model inference from the software, hardware, and network perspectives, along with practical tips for inference optimization.
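
A natural starting point for the optimization work Mia covers is measuring where you stand. The sketch below, assuming an already-deployed SageMaker real-time endpoint, times repeated invocations and reports latency percentiles; the endpoint name and payload are assumptions for illustration.

```python
import json
import statistics
import time

import boto3

# boto3 client for invoking deployed SageMaker real-time endpoints.
smr = boto3.client("sagemaker-runtime", region_name="eu-central-1")

ENDPOINT_NAME = "nlp-sentiment-endpoint"  # assumed endpoint name; replace with yours
payload = json.dumps({"inputs": "This meetup talk was fantastic!"})

latencies_ms = []
for _ in range(50):
    start = time.perf_counter()
    response = smr.invoke_endpoint(
        EndpointName=ENDPOINT_NAME,
        ContentType="application/json",
        Body=payload,
    )
    response["Body"].read()  # drain the stream so timing covers the full response
    latencies_ms.append((time.perf_counter() - start) * 1000)

# p50 and p95 client-side latency; compare against your SLA before and after
# applying software (e.g. model compilation), hardware, or network optimizations.
p50 = statistics.median(latencies_ms)
p95 = statistics.quantiles(latencies_ms, n=20)[18]
print(f"p50={p50:.1f} ms  p95={p95:.1f} ms")
```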

Speaker Bio: Mia Chang is a Machine Learning Specialist Solutions Architect at Amazon Web Services, based in Berlin, Germany. She works with customers across EMEA on NLP and computer vision use cases, sharing best practices for AI/ML projects on AWS. Inspired by people in the technical communities, she likes to give back what she has learned, and has spent time in technical communities, user group events, and conferences since the beginning of her career. She co-authored two books about AI/ML on the cloud, which are available on Amazon. She enjoys yoga, meditation, cycling, and running.
