That figure is for training them. You need more like a few thousand dollars' worth of graphics cards to max out LLM performance for a single user running today's existing models. And we're starting to get pretty useful stuff in the 4-32GB range (typical consumer devices).
Replies (2)
Meant 4-32GB of memory; forgot to specify.
Yeah, that's more what I assumed it would be like by now.