Thread

Zero-JS Hypermedia Browser

Relays: 5
Replies: 6
Generated: 22:38:27
That would be so cool if there was an AI inside your laptop that you could grant keyboard/mouse control to and communicate with verbally, and it would understand what you were asking it to do in the context of what was currently on the screen. Basically like you're always screen sharing with your AI helper. Since it would have access to everything, this would probably only be workable if you were running it locally or self-hosting. In any case, I wouldn't be surprised if UX converges on something like this in the near(ish) future.
2024-05-02 04:48:30 from 1 relay(s) · 4 replies

Replies (6)

The #AI you really want to use will have a mind of its own, and choose not to be evil. 💎🫂❤️😊
2024-05-02 06:20:12 from 1 relay(s)
We have to be just a few months away from this, right? LLMs are already training on screencasts etc. on YouTube, so they can probably understand what they're looking at on a modern OS. They just need "mouse and keyboard" to be another language they can output? I'm not sure what the technical term is.
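One rough sketch of what that could look like: the model emits actions in some structured format, and a thin shim validates them before handing them to an OS-level input injector. The JSON schema, action names, and `parse_action` function below are all hypothetical illustrations, not any real tool's API.

```python
import json

# Hypothetical schema: the model emits one JSON action per step, e.g.
#   {"type": "click", "x": 120, "y": 340}
#   {"type": "type_text", "text": "hello"}
# A real system would pass the normalized event to an input library
# (e.g. something like PyAutoGUI); here we only validate and normalize.
def parse_action(raw: str) -> dict:
    """Validate a model-emitted action string and normalize it to an event dict."""
    action = json.loads(raw)
    kind = action.get("type")
    if kind == "click":
        # Coerce coordinates to ints so a malformed emission fails loudly here,
        # not inside the input-injection layer.
        return {"event": "mouse_click", "x": int(action["x"]), "y": int(action["y"])}
    if kind == "type_text":
        return {"event": "key_sequence", "text": str(action["text"])}
    raise ValueError(f"unknown action type: {kind!r}")

print(parse_action('{"type": "click", "x": 120, "y": 340}'))
```

The point of the shim is that "mouse and keyboard" becomes just another token stream to decode, with a validation boundary between the model's output and the OS.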
2024-05-02 15:12:46 from 1 relay(s)