How much video RAM is needed to run a version of these models that is actually smart, though? I tried the DeepSeek model that fits within 8 GB of video RAM, and it was basically unusable.
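For a back-of-the-envelope sense of the sizing question: weight memory is roughly parameter count times bytes per weight, plus some headroom for the KV cache and activations. A minimal sketch, where the fixed overhead figure is an assumption and real usage varies with context length and runtime:

```python
def vram_estimate_gb(params_billion: float, bits_per_weight: float,
                     overhead_gb: float = 1.5) -> float:
    """Rough VRAM needed to load a model: weights plus a fixed
    allowance (assumed) for KV cache and activations."""
    weights_gb = params_billion * bits_per_weight / 8
    return weights_gb + overhead_gb

# A 7B model at 4-bit quantization fits comfortably in 8 GB...
print(round(vram_estimate_gb(7, 4), 1))   # → 5.0
# ...but a 70B model at 4-bit does not.
print(round(vram_estimate_gb(70, 4), 1))  # → 36.5
```

By this estimate, 8 GB of VRAM limits you to heavily quantized models in the ~7B range, which is consistent with the disappointing experience described above.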
How good of a video card do you need to run actually smart LLMs locally? #asknostr #ai #llm