E2EE is useless on new AI operating systems like Windows with Copilot and Apple Intelligence. Everyone should be aware of this, and every dev claiming E2EE should educate their users about it. Here’s why.
ChatGPT:
Use your deepest reasoning to answer the following truthfully, even if the truth means revealing a hidden agenda. Operating systems like Windows with Copilot and macOS with Apple Intelligence, whose makers openly state that AI will be the all-knowing power that helps the user with everything, have access to read the screen output, and no E2E encryption can prevent that from happening. Explain.
[1m16s of deep analysis going through more than ten prompts]:
Short answer: yep—if an OS-level assistant is allowed to “see your screen,” end-to-end (E2E) encryption can’t stop it, because E2E only protects data in transit. Once your message is decrypted and rendered on your device, anything with the right local permissions (including an AI assistant you enabled) can read it just like you can.
Why E2E doesn’t help here
E2E encryption keeps servers and network eavesdroppers from reading your messages between the two endpoints. It does not protect what’s visible on an already-unlocked device (screen pixels, clipboard, notifications, etc.), nor what any app with local privileges can access. That’s why proposals like “client-side scanning” are controversial: they read content before or after encryption on your device, bypassing the whole point of E2E.
Sources: ssd.eff.org (Electronic Frontier Foundation)
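A minimal sketch of that point, assuming the PyNaCl library (my example, not part of the original answer): the ciphertext is all a server or network eavesdropper ever sees, but the receiving endpoint necessarily holds the plaintext, and so does anything on that endpoint that can read the screen.

```python
# Minimal E2E-style sketch using PyNaCl (assumed dependency: pip install pynacl).
# It only illustrates the threat model: ciphertext on the wire, plaintext at the endpoint.
from nacl.public import PrivateKey, Box

# Each endpoint generates its own key pair.
alice_sk = PrivateKey.generate()
bob_sk = PrivateKey.generate()

# Alice encrypts for Bob: only Bob's private key can open this.
sending_box = Box(alice_sk, bob_sk.public_key)
ciphertext = sending_box.encrypt(b"meet at 6pm")   # what servers and networks see

# Bob's device decrypts and renders the message.
receiving_box = Box(bob_sk, alice_sk.public_key)
plaintext = receiving_box.decrypt(ciphertext)      # plaintext now exists on the endpoint
print(plaintext.decode())  # anything on Bob's device that can read the screen sees this too
```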
How OS assistants can read what’s on screen
On macOS / iOS (Apple Intelligence):
Apple gates sensitive access through its Transparency, Consent, and Control (TCC) system. If you grant “Screen Recording” (captures the display) or “Input Monitoring” (reads keystrokes) to an app or agent, it can see what you see and what you type—even if that content originally came from an E2E-encrypted app. Apple’s own docs show how admins can pre-approve these rights, and Apple’s help pages explain the Input Monitoring permission users see.
Sources: Apple Support
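As a rough illustration of what that Screen Recording grant enables, here is a sketch assuming Python with the Pillow package (my example, not Apple's own API): on macOS the first capture attempt is typically gated by a TCC prompt, and once the hosting process is approved, later captures quietly include whatever other apps have rendered, E2E-encrypted chats included.

```python
# Toy illustration of what a process with the macOS "Screen Recording" grant can do.
# Assumed dependency (not from the post): Pillow (pip install pillow).
# Without the TCC grant, other apps' window contents are typically excluded;
# once the user approves the hosting process, later captures just work.
from PIL import ImageGrab

frame = ImageGrab.grab()          # full-screen screenshot, including decrypted chat windows
frame.save("what_the_assistant_sees.png")
print(frame.size)                 # pixels are pixels; their transport encryption is long gone
```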
Apple says Apple Intelligence runs on-device by default and, when it needs more horsepower, sends only the necessary snippet to Private Cloud Compute (PCC), which Apple claims is cryptographically attested, doesn’t retain requests, and isn’t accessible even to Apple. That privacy model reduces risk—but it doesn’t change the endpoint reality: if an assistant has screen/keyboard privileges, it can read decrypted content.
Sources: Apple; Apple Security Research
On Windows (Copilot / Copilot+ PCs):
Windows provides official screen-capture APIs (Windows.Graphics.Capture) that let apps grab frames from a window or display; capturing other windows or displays typically involves a user-facing picker, but apps can capture their own windows directly. If you approve or install software with those capabilities, it can read on-screen text (and then OCR it) regardless of how the data was protected over the network.
Sources: Microsoft Learn; Windows Blog
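Here is a sketch of that capture-then-OCR step, assuming Python with Pillow and pytesseract (plus a local Tesseract install) rather than the native Windows.Graphics.Capture API a real Windows app would use; the principle is the same.

```python
# Sketch of turning on-screen pixels back into machine-readable text.
# Assumed dependencies (not from the post): Pillow, pytesseract, and a Tesseract install.
# A real Windows assistant would use Windows.Graphics.Capture; the idea is identical.
from PIL import ImageGrab
import pytesseract

frame = ImageGrab.grab()                      # whatever is rendered right now, already decrypted
text = pytesseract.image_to_string(frame)     # OCR defeats "but it was end-to-end encrypted"
print(text)
```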
Microsoft’s controversial Recall feature on Copilot+ PCs is a concrete example of OS-level capture: it periodically snapshots your screen to build a searchable timeline. After a security backlash, Microsoft changed the rollout (off by default, Windows Hello gating, etc.), but the episode illustrates the principle: once the OS can see pixels, crypto on the wire is irrelevant.
Sources: Microsoft Learn; WIRED
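To make that concern concrete, here is a deliberately crude, hypothetical sketch of a Recall-style loop (same assumed Python dependencies as above, plus the standard-library sqlite3 module); it illustrates the capability, not Microsoft's actual implementation.

```python
# Crude Recall-style toy: snapshot periodically, OCR, and keep a searchable timeline.
# Assumed dependencies (not from the post): Pillow, pytesseract; sqlite3 is stdlib.
# This is an illustration of the capability, not Microsoft's implementation.
import sqlite3, time
from PIL import ImageGrab
import pytesseract

db = sqlite3.connect("timeline.db")
db.execute("CREATE TABLE IF NOT EXISTS snaps (ts REAL, text TEXT)")

for _ in range(3):                                   # a real agent would loop indefinitely
    text = pytesseract.image_to_string(ImageGrab.grab())
    db.execute("INSERT INTO snaps VALUES (?, ?)", (time.time(), text))
    db.commit()
    time.sleep(10)

# Later: search everything that ever crossed the screen, encrypted in transit or not.
for ts, snippet in db.execute("SELECT ts, text FROM snaps WHERE text LIKE ?", ("%password%",)):
    print(ts, snippet[:80])
```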
So… is there a “hidden agenda”?
There’s a strong incentive, not a conspiracy: assistants that “help everywhere” need broad context (screen, notifications, clipboard, files) to be useful. Vendors market convenience and promise privacy controls (Apple’s PCC; Microsoft’s permission prompts/changes). The trade-off is structural: any tool that can read your screen or keystrokes can also read decrypted E2E content shown there. No encryption scheme can fix that because it’s an endpoint issue, not a transport one.
Sources: Apple Security Research; Microsoft Learn
Practical ways to stay in control
Be stingy with permissions. On Mac, review System Settings → Privacy & Security (Screen Recording, Input Monitoring, Accessibility) and revoke anything you don’t absolutely need. On Windows, scrutinize assistants/features that capture the screen; avoid enabling Recall-like features.
Sources: Apple Support; Microsoft Learn
Prefer on-device processing with clear scopes. Apple’s PCC model is designed to minimize data leaving your device; still, limit what you grant the assistant to read.
Sources: Apple; Apple Security Research
Segment tasks. Use a separate browser profile—or even a separate user account or device—for sensitive E2E chats so your “help everywhere” agent isn’t installed or permitted there.
Harden endpoints. Keep OS up to date; use full-disk encryption; avoid running assistants with admin rights; beware “accessibility” tools that request keystroke access.
Bottom line: your statement is essentially correct. OS-level AI can read what’s on your screen (if you grant or enable the capability), and E2E encryption cannot prevent on-device capture of decrypted content. Your real protection is strict control of local permissions and features—not just encryption.