qwen3-coder-next is really good! all you need is 50GB vram, i pray for the day that we can all run our coding agents locally
Replies (13)
How big is it? 80B?
Yep
It will take a couple of years with the current RAM-flation.
Also I would hope for a China-free model, but that's optional.
There’s currently no open-source LLM capable of handling decent coding tasks? :o
Should work easily on a maxed out MacBook Pro then with 128GB shared memory.
me too!
don’t give me hope
Gotta try and see some hope through the dystopian doom fog
I got 16GB mate, offloading to system RAM slows it significantly
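For anyone doing the napkin math on why ~50GB is the floor for an 80B model: the weights alone dominate. A rough sketch (assuming the 80B parameter count mentioned above; real runtimes also need room for KV cache and activations, which is where the extra ~10GB over the 4-bit weights goes):

```python
# Back-of-the-envelope VRAM estimate for an 80B-parameter model.
# Illustrative only: ignores KV cache, activations, and runtime overhead.

PARAMS = 80e9  # assumed parameter count from the thread

def weight_gb(params: float, bits_per_weight: float) -> float:
    """Raw weight storage in GB at a given quantization level."""
    return params * bits_per_weight / 8 / 1e9

for bits in (16, 8, 4):
    print(f"{bits}-bit: ~{weight_gb(PARAMS, bits):.0f} GB for weights alone")
# 16-bit: ~160 GB, 8-bit: ~80 GB, 4-bit: ~40 GB
```

So at 4-bit quantization you're at ~40GB of weights plus overhead, which is why 16GB cards end up offloading most of the model to system RAM and crawling.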
Just got to pick up one of these baddies 😬
Stay humble, Stack RTX 6000s
💯 too bad the price went up like crazy since RAM shortage