Replies (7)

Not on the Mac mini; it's not even running OpenClaw, it's a new workstation for me. But yes, I have local models running under Ollama on my Fedora workstation, which my OpenClaw uses for basic tasks via subagents. I'm still using ChatGPT via OAuth for the heavy lifting, though. My OpenClaw runs on a Raspberry Pi with very limited agentic abilities outside its own machine. I trust agentic browsers like Comet more because they're limited to operating within the confines of the browser.
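For anyone curious what "OpenClaw uses local models for basic tasks" can look like in practice: Ollama exposes a simple HTTP API on localhost, so delegating a task is just a POST to its `/api/generate` endpoint. A minimal sketch using only the standard library; the model name is whatever you've pulled locally (`llama3.2` here is just an example):

```python
import json
import urllib.request

# Ollama's default local endpoint; the model name is an example --
# use whatever you have pulled with `ollama pull`.
OLLAMA_URL = "http://localhost:11434/api/generate"
MODEL = "llama3.2"

def build_request(prompt: str) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    return {"model": MODEL, "prompt": prompt, "stream": False}

def generate(prompt: str) -> str:
    """Send a prompt to a locally running Ollama server and return its reply."""
    body = json.dumps(build_request(prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # Requires a running `ollama serve` on this machine.
    print(generate("Summarize this commit message in one line."))
```

Because everything stays on localhost, nothing leaves the box, which is the whole point of using it for the cheap subagent tasks.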
Short answer: no. AI models and hardware are both evolving quickly, so unless you just want to play, buying a high-spec inference machine now won't pay for itself. I think we're moving towards the Taalas approach of baking a model into an ASIC, but we're not quite there yet. I expect that within a couple of years you'll be able to buy a ChatGPT-level brain on a card that plugs into a USB port for a couple of hundred dollars, giving your agentic tools local inference. You can already see a hint of that in the Tiiny AI module being launched on Kickstarter; it's not an ASIC, but it follows the same idea.