Replies (5)

someone 2 years ago
Likewise, we need to care more about what data is going into us than about what data is getting out of us (privacy).
nobody 2 years ago
Well, in theory humans build tools and use them to their advancement. When the rhetorical narratives began giving power to AI and stripping human agency over decisions surrounding its creation, by sowing doubt about our ability to navigate AGI, sentience, and the like, we gave up the confidence to wield our tools effectively. As cognitive doubt increases, cognitive health declines, and AGI expands its reach, confidence will give way to fear, and humans will submit to being trained. In a sense, it has already begun, and it's promoted by administrative powers: first human governance, now digital governance built on those manipulative models.
I agree. Also, having a good filter implies exposure to a vast range of data, if we are to hope for a symbiotic relationship with technology instead of a parasitic one. The Greeks used to say that an educated mind is capable of entertaining thoughts without accepting them. Data are the thoughts of a superintelligence. If they spark ideas in us, a link is being made. Etc.