Not on the Mac mini; it's not even running OpenClaw. This is a new workstation for me.
But yes, I have local models running under Ollama on my Fedora workstation, which my OpenClaw uses for basic subagent tasks.
I'm still using ChatGPT via OAuth for the heavy lifting though.
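For anyone curious, the split described above (cheap local models for simple subagent work, a hosted model for the heavy lifting) boils down to a routing decision. Here's a minimal sketch of that idea; the task labels, threshold set, and function names are made up for illustration, and only the Ollama default endpoint (`http://localhost:11434`) is real:

```python
# Hypothetical sketch of the "local for basic, hosted for heavy" routing
# described above. Task categories and names are assumptions, not OpenClaw's
# actual config; only Ollama's default endpoint is a real value.

LOCAL_ENDPOINT = "http://localhost:11434/api/generate"  # Ollama's default API
REMOTE_MODEL = "hosted-heavy-model"  # stand-in for the ChatGPT side

def pick_backend(task: str) -> str:
    """Route short, mechanical subagent tasks to the local model
    and send everything else to the hosted one."""
    basic_tasks = {"summarize", "classify", "extract", "rename"}
    return "local" if task in basic_tasks else "remote"

print(pick_backend("summarize"))       # a basic task stays local
print(pick_backend("plan-refactor"))   # heavier work goes to the hosted model
```

Whether this is worth it depends on hardware, as the reply below notes; on a small box the local side may only be viable for the tiniest tasks.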
My OpenClaw is running on a Raspberry Pi with very limited agentic abilities outside its own computer. I trust agentic browsers like Comet more because they are limited to operating within the confines of the browser.
Yeah, I’m running a similar setup, but trying some free models through OpenRouter for the “easy stuff”.
And I’ve been asking myself if there’s any point to local models unless you’ve got heavy-duty hardware.