NEW Mixtral 8x22b Tested - Mistral's New Flagship MoE Open-Source Model

Published: November 1, 2024
on channel: Matthew Berman
55,826 views
1.8k likes

Mistral AI just launched Mixtral 8x22B, a massive MoE open-source model that is topping benchmarks. Let's test it!

Be sure to check out Pinecone for all your Vector DB needs: https://www.pinecone.io/

Join My Newsletter for Regular AI Updates 👇🏼
https://www.matthewberman.com

Need AI Consulting? ✅
https://forwardfuture.ai/

My Links 🔗
👉🏻 Subscribe:    / @matthew_berman  
👉🏻 Twitter:   / matthewberman  
👉🏻 Discord:   / discord  
👉🏻 Patreon:   / matthewberman  

Media/Sponsorship Inquiries 📈
https://bit.ly/44TC45V

Links:
LLM Leaderboard - https://bit.ly/3qHV0X7
Mixtral Model - https://huggingface.co/lightblue/Kara...