GPT-like large language models can now run on your laptop. Meta's LLaMA was quickly picked up by the open-source community and enhanced into something more. After some clever engineering, you can run the 65-billion-parameter model on an M1 Mac with plenty of RAM and get performance close to GPT-3. Thanks, Georgi. After some more clever engineering, you can install it as an npm package. After some MORE clever engineering, you can run the 7-billion-parameter version, with performance approaching GPT-3 and ChatGPT, on a Raspberry Pi.
Current Sub Count: 8,110
Business Email: [email protected]
Patreon: / siddhantdubey
🤖 Join my discord server: / discord
📸 Instagram - / sid12.dubey
🐦 Twitter - / sidcodes
💻 GitHub - https://github.com/siddhantdubey/
🎵 Follow my TikTok: / sidcodes
Music In This Video:
Epidemic Sound: https://share.epidemicsound.com/02ed67
(I do get benefits from the above link)
WHO AM I?
I'm a second-year Computer Science major at Georgia Tech. On this channel, I make videos about machine learning and programming, lifestyle vlogs, productivity tips, and more! If that sounds interesting, be sure to hit the subscribe button and leave a like on this video! Subscribe here: http://youtube.com/c/SiddhantDubey/?s...