Just had a quick look into this, and it seems possible to do for free — i.e., run open-source models like Llama 2, Mistral, or Phi-2 locally on a Mac using Ollama.
No internet connection, no API keys, no usage limits, and Apple Silicon runs them well.
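For anyone who wants to try it, a minimal setup sketch (this assumes Homebrew is installed; Ollama's official macOS installer works too, and the model names are the ones published in the Ollama library):

```shell
# Install Ollama on macOS via Homebrew
brew install ollama

# Start the Ollama server (it listens locally on port 11434)
ollama serve &

# Pull and run a model entirely on-device — no API key, no cloud
ollama run llama2 "Summarize the benefits of running LLMs locally."

# Smaller models like Mistral or Phi-2 are quick on Apple Silicon
ollama run mistral "Explain quantization in one sentence."
```

The first `ollama run` for a given model downloads its weights once; after that, everything runs offline.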
Replies (1)
You can even use Dave with a setup like this: a fully private, local AI assistant that can find and summarize notes for you.