Not on the Mac mini, it's not even running OpenClaw; that's a new workstation for me. But yes, I have local models running under Ollama on my Fedora workstation, which my OpenClaw uses for basic tasks via subagents. I'm still using ChatGPT via OAuth for the heavy lifting, though. My OpenClaw runs on a Raspberry Pi with very limited agentic abilities outside its own computer. I trust agentic browsers like Comet more because they're limited to operating within the confines of the browser.