This video goes fast. Buckle up!
Here are the best graphics cards you can upgrade to for top performance with locally hosted large language models!
Oobabooga WebUI, koboldcpp, and in fact any software built for easily accessible local LLM text generation and private AI chat share similar sweet spots when it comes to the top consumer GPUs that maximize performance. Here is my benchmark-backed list of the 6 graphics cards I found to be the best for working with various open-source large language models locally on your PC. Read on!
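The reason the 24GB cards top the list comes down to VRAM: the whole model (plus context overhead) should fit on the GPU. As a rough back-of-the-envelope sketch (my own assumption, not a figure from the video), you can estimate the memory a quantized model needs from its parameter count and bits per weight:

```python
# Rough VRAM estimate for loading a quantized LLM fully on the GPU.
# Assumption (mine, not from the video): memory ~ parameters * bytes-per-weight,
# plus ~20% overhead for the KV cache, activations, and buffers.

def estimate_vram_gb(params_billion: float, bits_per_weight: float,
                     overhead: float = 0.2) -> float:
    """Approximate VRAM (GB) needed to run a model entirely on the GPU."""
    weight_gb = params_billion * bits_per_weight / 8  # 1B params at 8 bits ~ 1 GB
    return weight_gb * (1 + overhead)

# A 13B model quantized to 4 bits fits on a 12GB card like the RTX 3060:
print(round(estimate_vram_gb(13, 4), 1))  # ~7.8 GB
# A 30B model at 4 bits calls for a 24GB card like the 3090 Ti or 4090:
print(round(estimate_vram_gb(30, 4), 1))  # ~18.0 GB
```

Real usage varies with context length and backend, so treat this as a ballpark, not a guarantee.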
Check out the full list here: https://techtactician.com/best-gpu-fo...
Or read more about the 3090 Ti and its price-to-performance ratio for AI-related uses: https://techtactician.com/nvidia-gefo...
Here are all of the graphics cards mentioned in the video:
=[The Top 24GB VRAM Cards]=
1. RTX 4090 24GB - https://amzn.to/3EuwU5y
2. RTX 3090 Ti 24GB - https://amzn.to/3Zc9Di9
=[The Rest]=
1. RTX 4080 16GB - https://amzn.to/3PdZ7CI
2. RTX 4070 Ti 12GB - https://amzn.to/45UILWi
3. RTX 3080 Ti 12GB - https://amzn.to/3r5cNrt
=[The Budget Choice]=
1. RTX 3060 12GB - https://amzn.to/3uafeun
Here are some more sources on the AMD GPUs and AI debate:
https://news.ycombinator.com/item?id=...
/ d_why_does_amd_do_so_much_less_work_in_ai_...
This channel, just like my main website, is supported by its viewers and readers. The links to online stores featured on the channel and the site are affiliate links, and I may get a small cut when you make a purchase through them. Thank you for your support!