My take: unless you want to develop your own LLMs, I'd wait for the consumer architecture to stabilize instead of being an early adopter and spending too much on a local rig.
Replies (2)
Training your own LLMs on your own specific dataset is the only use case I'd suggest building a local rig for.
It's hard to have too much sovereign compute, though.