kpr797 8 months ago
How much video RAM is needed to run a version of the models that are actually smart though? I tried the Deepseek model that fits within 8 GB of video RAM, and it was basically unusable.
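As a rough back-of-envelope answer (an assumption on my part, not something stated in the thread): the VRAM needed to hold a model is roughly parameter count times bytes per weight, plus some overhead for the KV cache and activations. The function name, the ~20% overhead figure, and the example model sizes below are all illustrative, not from the post:

```python
# Rough VRAM estimate for running a quantized LLM locally.
# Rule of thumb (an assumption, not from the thread):
#   weights ≈ parameters × bits_per_weight / 8,
#   plus ~20% overhead for KV cache and activations.

def estimate_vram_gb(params_billion: float, bits_per_weight: float,
                     overhead: float = 0.2) -> float:
    """Return an approximate VRAM requirement in GB."""
    weight_gb = params_billion * bits_per_weight / 8  # GB just for the weights
    return weight_gb * (1 + overhead)

if __name__ == "__main__":
    # A 7B model at 4-bit quantization: ~3.5 GB of weights, ~4.2 GB total,
    # which is why 7B-class models are about the limit for an 8 GB card.
    print(round(estimate_vram_gb(7, 4), 1))
    # A 70B model at 4-bit: ~35 GB of weights, ~42 GB total,
    # i.e. multi-GPU territory or heavy CPU offloading.
    print(round(estimate_vram_gb(70, 4), 1))
```

By this estimate, an 8 GB card can only fit heavily quantized small models (the "unusable" experience described above), while the larger, noticeably smarter variants want 24 GB or more.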

Replies (1)

kpr797 8 months ago
How good of a video card do you need to run actually smart LLMs locally? #asknostr #ai #llm