kpr797 8 months ago
How good of a video card do you need to run actually smart LLMs locally? #asknostr #ai #llm
kpr797
How much video RAM do you need to run a version of these models that's actually smart, though? I tried the DeepSeek model that fits in 8 GB of VRAM, and it was basically unusable.
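Rough back-of-the-envelope, as a sketch rather than a benchmark: a GGUF-quantized model needs roughly parameter count × bits per weight ÷ 8 for the weights, plus a couple of GB for the KV cache and buffers. Assuming ~4.8 bits/weight, which is typical for a Q4_K_M quant:

```python
# Very rough VRAM estimate for a quantized LLM (a sketch, not a benchmark):
# weights take params * bits_per_weight / 8 bytes, plus KV cache / buffers.

def est_vram_gb(params_b: float, bits_per_weight: float,
                overhead_gb: float = 2.0) -> float:
    return params_b * bits_per_weight / 8 + overhead_gb

for name, p in [("8B", 8), ("14B", 14), ("32B", 32), ("70B", 70)]:
    # ~4.8 bits/weight is typical for a Q4_K_M GGUF quant
    print(f"{name}: ~{est_vram_gb(p, 4.8):.1f} GB")
# 8B: ~6.8 GB, 14B: ~10.4 GB, 32B: ~21.2 GB, 70B: ~44.0 GB
```

By that math an 8 GB card only fits models up to roughly the 8B class at Q4, which lines up with the small DeepSeek distill feeling weak.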

Replies (2)

Idk. Not even the stuff running in massive data centers is "actually smart," but a good APU can do a lot even with just the integrated GPU, and from there adding more RAM probably helps more than a bigger GPU.
I run them on my 3090 with KoboldCpp. Modern 30Bs are pretty smart. But you don't need a good GPU to have fun with an LLM: 6 GB of VRAM is enough, and otherwise there are free options like Colab that can also run KoboldCpp.
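When the whole model doesn't fit in VRAM, KoboldCpp can put part of it on the GPU via its --gpulayers flag and run the rest on CPU. A rough way to pick the value, assuming an illustrative ~8 GB 13B Q4 file with 40 layers (the file size and layer count here are assumptions, not measurements):

```python
# Sketch: pick a --gpulayers value for KoboldCpp when the whole model
# won't fit in VRAM. File size and layer count below are illustrative.

def gpu_layers(vram_gb: float, model_file_gb: float, n_layers: int,
               reserve_gb: float = 1.0) -> int:
    # Spread the file size evenly over the layers and leave some VRAM
    # headroom for the KV cache and compute buffers.
    per_layer_gb = model_file_gb / n_layers
    return max(0, min(n_layers, int((vram_gb - reserve_gb) / per_layer_gb)))

# e.g. an ~8 GB 13B Q4 file with 40 layers on a 6 GB card:
print(gpu_layers(6.0, 8.0, 40))  # -> 25 layers on GPU, the rest on CPU
```

The more layers you can keep on the GPU, the faster generation gets; the CPU handles whatever's left, just slower.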