It would be so cool if there were an AI on your laptop that you could grant keyboard/mouse control to and communicate with verbally, and it would understand what you were asking it to do in the context of what was currently on the screen. Basically, it's like you're always screen sharing with your AI helper. Since it would have access to everything, this would probably only be workable if you were running locally or self-hosting. In any case, I wouldn't be surprised if UX converges on something like this in the near(ish) future.
Replies (6)
Microsoft pretty much has that now, except that your data is shared with 785 partners.
Oh really? What's it called?
OK, this Microsoft version seems creepy and useless haha
The #AI you really want to use will have a mind of its own, and choose not to be evil. 💎🫂❤️😊
We have to be just a few months away from this, right? LLMs are already training on screencasts etc. on YouTube, so they can probably understand what they're looking at on a modern OS. They just need "mouse and keyboard" to be another language they can output? I'm not sure what the technical term is.
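To make that concrete, here's a minimal sketch of what "mouse and keyboard as an output language" could look like: the model emits structured actions (the schema here is made up for illustration) and a small local loop replays them as real input events. The use of pyautogui and the specific action names are assumptions, not any actual product's API.

```python
# Hypothetical sketch: treat mouse/keyboard actions as a structured "language"
# the model emits, then replay them locally. Action schema is invented here.
import json
import pyautogui  # cross-platform mouse/keyboard automation

def execute_action(action: dict) -> None:
    """Map one model-emitted action onto a real input event."""
    kind = action.get("type")
    if kind == "click":
        pyautogui.click(action["x"], action["y"])
    elif kind == "type":
        pyautogui.write(action["text"], interval=0.02)
    elif kind == "press":
        pyautogui.press(action["key"])  # e.g. "enter", "tab"
    else:
        print(f"Unknown action: {action}")

# Example of what the model's output might look like for
# "click the address bar and search for cat pictures" -- purely illustrative.
model_output = """
[{"type": "click", "x": 640, "y": 60},
 {"type": "type", "text": "cat pictures"},
 {"type": "press", "key": "enter"}]
"""

for act in json.loads(model_output):
    execute_action(act)
```

In practice you'd also feed screenshots back to the model between actions so it can see the result of each step, which is the "always screen sharing" part of the idea above.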