Replies (2)
Just had a quick look into this and it seems possible to do this for free, i.e. running open-source models like Llama 2, Mistral, or Phi-2 locally on the Mac using Ollama.
No internet, no API keys, no limits, and Apple Silicon runs them well.
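For anyone who wants to try it, a minimal sketch of the setup, assuming Homebrew is installed (Ollama can also be downloaded directly from its website):

```shell
# Install the Ollama CLI on macOS via Homebrew (assumption: Homebrew is available)
brew install ollama

# Pull a model's weights ahead of time (downloads happen once, then it's fully offline)
ollama pull mistral

# Start an interactive chat session with the model, running locally on Apple Silicon
ollama run mistral
```

Model names like `llama2`, `mistral`, and `phi` are what Ollama's library uses; smaller quantized variants are available if memory is tight.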
Would 48 GB be sufficient?