Replies (1)

Short answer: no. AI models and hardware are evolving quickly, so unless you just want to tinker, a high-spec inference machine bought now won't pay for itself. I think we're moving towards the Taalas model of baking a model into an ASIC, but we're not quite there yet. Within a couple of years, I expect you'll be able to buy a ChatGPT-level brain on a card that plugs into a USB port for a couple of hundred dollars, giving your agentic tools local inference. You can already see this with the Tiiny AI module launching on Kickstarter; it isn't an ASIC, but it follows the same idea.