This changes everything.
Introducing Maple Proxy – a powerful bridge that lets you call Maple’s end‑to‑end‑encrypted LLMs with any OpenAI‑compatible client, without changing a line of code.
https://blossom.primal.net/19887ff7b4b7c95e1b195c5b7790f9457035a75cb5e6690dab0129fc4cb2dc2c.mp4
Read the full announcement or check out the thread
https://blog.trymaple.ai/introducing-maple-proxy-the-maple-ai-api-that-brings-encrypted-llms-to-your-openai-apps/
Why it matters: Most AI‑powered apps (calorie trackers, CRMs, study companions, relationship apps 👀) send every prompt to OpenAI, exposing all user data. With Maple Proxy, that data never leaves a hardware‑isolated enclave.
https://blossom.primal.net/21b4ee5b782d240d3eb06c3db41ffa73ec79734f104ee94113ff2dc6d7e771c3.mp4
How it works:
🔒 Inference runs inside a Trusted Execution Environment (TEE).
🔐 End‑to‑end encryption keeps prompts and responses private.
✅ Cryptographic attestation proves you’re talking to genuine secure hardware.
🚫 Zero data retention – no logs, no training data.
Ready‑to‑use models (pay‑as‑you‑go, per million tokens):
- llama3-3-70b – general reasoning
- gpt-oss-120b – creative chat
- deepseek-r1-0528 – advanced math & coding
- mistral-small-3-1-24b – conversational agents
- qwen2-5-72b – multilingual work & coding
- qwen3-coder-480b – specialized coding assistant
- gemma-3-27b-it-fp8-dynamic – fast image analysis
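Once the proxy is running locally (setup steps below), you can check which of these models your account can reach. A minimal sketch, assuming the proxy exposes the standard OpenAI‑compatible `/v1/models` route and that you’ve exported your key as `MAPLE_API_KEY`:
```bash
# List the models the local proxy will serve (standard OpenAI-style route; path assumed)
curl http://localhost:8080/v1/models \
  -H "Authorization: Bearer $MAPLE_API_KEY"
```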
Real‑world use cases:
🗓️ A calorie‑counting app replaces public OpenAI calls with Maple Proxy, delivering personalized meal plans while keeping dietary data private.
📚 A startup’s internal knowledge‑base search runs through the proxy, so confidential architecture details never leave the enclave.
👩‍💻 A coding‑assistant plug‑in for any IDE points to http://localhost:8080/v1 and suggests code, refactors, and explains errors without exposing proprietary code.
Getting started is simple:
Desktop app (fastest for local dev)
- Download from trymaple.ai/downloads
- Sign up for a Pro/Team/Max plan (starts at $20/mo)
- Purchase $10+ of credits
- Click “Start Proxy” → API key & localhost endpoint are ready.
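As a quick smoke test after clicking “Start Proxy”, you can send a chat request straight from the terminal. This is a sketch that assumes the proxy follows the standard OpenAI chat‑completions route and that you’ve exported the key the app gave you as `MAPLE_API_KEY`:
```bash
# Hypothetical smoke test against the local proxy; model ID taken from the list above
curl http://localhost:8080/v1/chat/completions \
  -H "Authorization: Bearer $MAPLE_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llama3-3-70b",
    "messages": [{"role": "user", "content": "Say hello from inside the enclave."}]
  }'
```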
Docker image (production‑ready)
- `docker pull ghcr.io/opensecretcloud/maple-proxy:latest`
- Run with your MAPLE_API_KEY and MAPLE_BACKEND_URL
- You now have a secure OpenAI‑compatible endpoint at http://localhost:8080/v1
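Roughly, the run command looks like the sketch below. The image name and the two environment variables come from the steps above; the port mapping, flags, and the exact backend URL are assumptions, so check the documentation for the real values:
```bash
# Sketch only: confirm env var values, internal port, and flags against the official docs
docker run -d --name maple-proxy \
  -p 8080:8080 \
  -e MAPLE_API_KEY="<your-maple-api-key>" \
  -e MAPLE_BACKEND_URL="<backend-url-from-docs>" \
  ghcr.io/opensecretcloud/maple-proxy:latest
```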
Compatibility: Any library that lets you set a base URL works—LangChain, LlamaIndex, Amp, Open Interpreter, Goose, Jan, and virtually every OpenAI‑compatible SDK.
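In most of these tools the switch is nothing more than a base URL and a key. For example, the official OpenAI SDKs read the environment variables below; other libraries take the same two values through an explicit `base_url` / `api_key` setting (names vary, so treat this as a sketch):
```bash
# Point any OpenAI-compatible SDK at the local Maple Proxy instead of api.openai.com
export OPENAI_BASE_URL="http://localhost:8080/v1"
export OPENAI_API_KEY="$MAPLE_API_KEY"   # the key issued when you started the proxy
```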
Need more detail? Check the technical write‑up and the full API reference on GitHub, or join the Discord community for real‑time help.
https://blog.trymaple.ai/maple-proxy-documentation/
Start building with private AI today: download the app or pull the Docker image, upgrade to a plan, add a few dollars of credits, point your client to http://localhost:8080/v1, and secure all your apps.
NICE
nostr:nevent1qqs0wkhgplenzzruh8c5twvht6pngrunlag3eqhp2mlm0c3tm2uhesgpzemhxue69uhhyetvv9ujuurjd9kkzmpwdejhgug20ck
Love it
So now MapleAI via routstr paid with ecash instead of a login?
not the same type of proxy but would be great! nostr:nprofile1qqs8msutuusu385l6wpdzf2473d2zlh750yfayfseqwryr6mfazqvmgpz3mhxue69uhhyetvv9ukzcnvv5hx7un89uqnqamnwvaz7tmfw33ks7fdvahkcer9deex7epdve6hycm0de6xsmmsdpskwatn9eekxctjv93zu6td9ufpdr9u should run a routstr proxy
I'm insanely bullish on confidential compute and nostr:nprofile1qyjhwumn8ghj7en9v4j8xtnwdaehgu3wvfskuep0dakku62ltamx2mn5w4ex2ucpxpmhxue69uhkjarrdpuj6em0d3jx2mnjdajz6en4wf3k7mn5dphhq6rpva6hxtnnvdshyctz9e5k6tcqyp7u8zl8y8yfa87nstgj2405t2shal4rez0fzvxgrseq7k60gsrx6zeuh5t / nostr:nprofile1qyshwumn8ghj7en9v4j8xtnwdaehgu3wvfskuep0w3uhqetnvdexjur5qyfhwumn8ghj7ur4wfcxcetsv9njuetn9uqzphchxfm3ste32hfhkvczzxapme9gz5qvqtget6tylyd7wa8vjecgaq9enl
AI will provide a tremendous amount of value for the world. But it will also allow for mind manipulation on steroids that makes what happened with web 2.0 cos like Facebook look tame. But there's now an alternative to the closed source mega models and hosting - OpenSecret / MapleAI effectively offer developers a publicly accessible (and open source) version of Apple's encrypted inference cloud: https://security.apple.com/blog/private-cloud-compute/
Thank god for cryptography, the work of people like Anthony / nostr:nprofile1qydhwumn8ghj7mnjv4kxz7fwvvkhxar9d3kxzu3wdejhgtcprpmhxue69uhhqun9d45h2mfwwpexjmtpdshxuet5qqsgafy9ye4j9p2x8vfmlq6equtpcg4m8ks7v545g0d3f7wwueeq5scz9g5a6, and organizations like Apple
nostr:nevent1qvzqqqqqqypzqlwr30njrjy7nlfc95f92h694gt7l63u3853xrypcvs0td85gpndqyjhwumn8ghj7en9v4j8xtnwdaehgu3wvfskuep0dakku62ltamx2mn5w4ex2ucpr4mhxue69uhkummnw3ez6vp39eukz6mfdphkumn99e3k7mf0qywhwumn8ghj7mn0wd68ytfsxgh8jcttd95x7mnwv5hxxmmd9uqzpa66aq8lxvgg0julz3deja0gxdq0j0l4z8yzu9t0ldlz90dtjlxpk9kxgm
This is a pretty big deal
nostr:nevent1qvzqqqqqqypzqlwr30njrjy7nlfc95f92h694gt7l63u3853xrypcvs0td85gpndqyjhwumn8ghj7en9v4j8xtnwdaehgu3wvfskuep0dakku62ltamx2mn5w4ex2ucpr4mhxue69uhkummnw3ez6vp39eukz6mfdphkumn99e3k7mf0qqs0wkhgplenzzruh8c5twvht6pngrunlag3eqhp2mlm0c3tm2uhesgnmn3rc
Wen NWC payment option? I just want to let it rip with a budget.
Hey nostr:nprofile1qqsdy27dk8f9qk7qvrm94pkdtus9xtk970jpcp4w48k6cw0khfm06mspz4mhxue69uhkummnw3ezummcw3ezuer9wchsz9mhwden5te0wfjkccte9ehx7um5wghxyctwvshsh9tjpk reaching out on here as well. Can we make something happen to get Maple encrypted LLMs available on PPQ?
nostr:nevent1qqs0wkhgplenzzruh8c5twvht6pngrunlag3eqhp2mlm0c3tm2uhesgpzemhxue69uhhyetvv9ujuurjd9kkzmpwdejhgug20ck
Routstr doesn’t speak OpenAI API?
Is PPQ closed source?
It does, in that it can use OpenAI upstream and issue API keys from the Routstr client to use in other OpenAI clients.
I think Maple may sit more as an upstream provider from a Routstr perspective, in that my Routstr provider would use Maple as an upstream, and users would query my provider, pay in Cashu, and receive the response from Maple via my proxy.
Maple AI could run a Routstr proxy and offer no-account, KYC-free access to Maple... but all I see is that Maple could add Nostr login and save encrypted chats to Nostr relays, like Routstr is doing now.
Just spitballing what I think the configuration is.
so Maple + goose = ❤?
It does, but we need end-to-end encryption, so the Routstr node should not touch the unencrypted part, and it would be sick if Maple hosted a Routstr node to serve its own models.
i think so
Run this in Goose 👀
nostr:nevent1qqs0wkhgplenzzruh8c5twvht6pngrunlag3eqhp2mlm0c3tm2uhesgpzemhxue69uhhyetvv9ujuurjd9kkzmpwdejhgug20ck
That’s because OpenAI’s open-weight model is available on AWS now, right?
We have the OpenAI open-source gpt-oss-120b model running in a secure GPU. That gives users end-to-end encryption. It is comparable to the GPT-4 family of models.
yep
legendary! having not read anything about this proxy, are there docs specifically for the goose scenario?
Hey guys, this is great, thank you for building.
Can you please enable monthly bitcoin payments?
Incredible. Big unlock for privacy focused AI
nostr:nevent1qqs0wkhgplenzzruh8c5twvht6pngrunlag3eqhp2mlm0c3tm2uhesgpzemhxue69uhhyetvv9ujuurjd9kkzmpwdejhgug20ck
LFG 🔥🚀
∞
nostr:nevent1qvzqqqqqqypzqlwr30njrjy7nlfc95f92h694gt7l63u3853xrypcvs0td85gpndqyjhwumn8ghj7en9v4j8xtnwdaehgu3wvfskuep0dakku62ltamx2mn5w4ex2ucpr4mhxue69uhkummnw3ez6vp39eukz6mfdphkumn99e3k7mf0qqs0wkhgplenzzruh8c5twvht6pngrunlag3eqhp2mlm0c3tm2uhesgnmn3rc I need to try Maple soon 👀
Well worth the annual subscription!! Love the product
Interesting use of TEEs!
Do I understand that correctly: I need to be a Pro subscriber and then purchase API credits on top?!
I love the zero-knowledge approach you guys have, but for the occasional document analysis I can't justify $20/month. I would gladly pay twice what other models charge on a pay-per-token basis, like on ppq.ai.
Yes
What about Claude Code?
Some info below, not sure if it’s 100% private
https://github.com/anthropics/claude-code/issues/7178?utm_source=chatgpt.com