You can run it on any OpenAI-compatible AI backend with tool-calling support.
In this case I was using a random tools-capable model I had on my Ollama server: hhao/qwen2.5-coder-tools:latest
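For reference, here's a minimal sketch of what that setup looks like: an OpenAI-compatible client pointed at an Ollama backend with a tools-capable model. The localhost URL assumes Ollama's default OpenAI-compatible endpoint, and the get_weather tool is made up purely for illustration; neither comes from this thread.

```python
# Minimal sketch: calling an Ollama server through its OpenAI-compatible API
# with a tools-capable model. The base_url, api_key placeholder, and the
# get_weather tool definition are illustrative assumptions.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
    api_key="ollama",                      # Ollama accepts any non-empty key
)

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
]

response = client.chat.completions.create(
    model="hhao/qwen2.5-coder-tools:latest",
    messages=[{"role": "user", "content": "What's the weather in Oslo?"}],
    tools=tools,
)

# If the model decides to use a tool, the call shows up here
print(response.choices[0].message.tool_calls)
```

Any backend that speaks the same chat-completions-with-tools API should work the same way; only the base_url and model name change.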
oh cool!