Replies (36)

We are all going to have our own suite of custom apps and it's going to be 🔥. The dev talent is going to wander over to app generators. We don't all need to use the same apps if the data (including user settings!) is standardized and stored in events; then apps are just more-or-less event viewers. Once paper and pen became cheap enough, the only thing of value was the words being written.
In the future, you would need to do one of the following: 1) build something so sprawling or complex that AI would have trouble copying it, because of the QA, architecture, and maintenance involved; 2) build something that is meant to be used as part of a larger suite, including services and communities; 3) build a specialized-app generator; or 4) take up golfing.
I'm saying that the only thing the user owns is the data and application/relay settings stored in events. If they keep it in events, then they can move from relay to relay and app to app, and have the same setup everywhere. So you would choose your relay based upon service, including whether they honor what is in your events. For instance, @cloud fodder allows me to use my npub's lists (mute, follows, favorites, etc.) to determine who is allowed to use my private relay. I edit those lists and click a button in relay.tools, and it updates the blocks/allows. That means I could have 15 different relays, from different providers and/or self-hosted, and they'd all have the same settings, if I choose. Or I could have a set of lists by topic and have a book club relay, a Christian chat relay, a dev relay, etc. Filter.wine does something similar with WoT; it is controlled by the changes to your follow list. I house the relay and app settings within my events.
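A rough sketch of what that looks like on the wire, assuming the standard NIP-51 list kinds (kind 3 for follows, kind 10000 for mutes) and plain NIP-01 WebSocket messages. The pubkey, relay URL, and what you do with the resulting sets are placeholders, and this is not relay.tools' actual code:

```typescript
// Illustrative sketch: derive a relay allow/deny policy from a user's own
// Nostr list events. kind 3 = follow list, kind 10000 = mute list (NIP-51).
// OWNER_PUBKEY and SOURCE_RELAY are placeholders.

const OWNER_PUBKEY = "<owner hex pubkey>";      // whose lists drive the policy
const SOURCE_RELAY = "wss://relay.example.com"; // where those lists are published

type NostrEvent = { kind: number; tags: string[][]; pubkey: string };

// "p" tags hold the listed pubkeys in both follow and mute lists
function pubkeysFromTags(ev: NostrEvent): string[] {
  return ev.tags.filter((t) => t[0] === "p").map((t) => t[1]);
}

function buildPolicy(): Promise<{ allow: Set<string>; deny: Set<string> }> {
  return new Promise((resolve) => {
    const allow = new Set<string>();
    const deny = new Set<string>();
    const ws = new WebSocket(SOURCE_RELAY);

    ws.onopen = () => {
      // One NIP-01 REQ for the owner's latest follow list and mute list
      ws.send(JSON.stringify([
        "REQ", "policy",
        { kinds: [3, 10000], authors: [OWNER_PUBKEY], limit: 2 },
      ]));
    };

    ws.onmessage = (msg) => {
      const [type, , payload] = JSON.parse(msg.data.toString());
      if (type === "EVENT") {
        const ev = payload as NostrEvent;
        const target = ev.kind === 3 ? allow : deny; // follows -> allow, mutes -> deny
        pubkeysFromTags(ev).forEach((pk) => target.add(pk));
      } else if (type === "EOSE") {
        ws.close();
        resolve({ allow, deny });
      }
    };
  });
}

// Re-run this whenever the lists change and feed the sets into the
// relay's write policy (the "click a button" step above).
buildPolicy().then(({ allow, deny }) =>
  console.log(`allowing ${allow.size} pubkeys, muting ${deny.size}`));
```

The point being that the policy lives in my events, not in some provider's admin panel, so any relay that honors them ends up with the same settings.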
What AI+FOSS does is end the concept of the Killer App, where you just code some blob, sell the rights to Microsoft, and walk away a gazillionaire. You will now actually have to add concrete value and do real labour. There has to be some reason I keep going back to This One Person, and that won't be the app. Make Software Development Work Again.
... while at the other end, however, most of the AI stuff is still owned by huge corporations that can afford the hardware and the operational skills to handle AI. In my environment, this has pushed the move to the cloud even more, most specifically to Azure, as they also host GPT and similar models in European datacenters, which eases the pain of those who always wanted to take that step but felt hindered so far by GDPR obligations. 🙈
Think about it this way: There used to be one or two ovens per village, either at the baker's or in the town square. It took hours to heat up, by packing it full of precious wood, so everyone would get their bread from there, as building an oven and burning all that wood just for one loaf of bread would have been crazy. There were usually only a couple of kinds of bread being baked, as the work around the oven was so laborious.

Then people eventually got wood-burning kitchen ovens that doubled as a heater and a stove. Then they baked their own bread, since the time and cost became negligible. But most people just baked the same bread over and over. At the most, they'd break it up into dinner rolls, or something. And then everyone got electric ovens and it got even easier and cheaper... and they all stopped baking. Now, almost everyone goes to the grocery store or the master bakery, where there is a wide, curated selection of bread, and chooses from that selection.

In the same way, AI app generation won't lead to everyone creating their own apps. It'll lead to people becoming app-generation specialists who offer an array of apps to choose from and further customize, plus hosting options, services, etc. Or they will concentrate on the data and storage layer, where the value is. The dev talent will have to go one level deeper, in multiple directions, the same way that baking had to become a factory good or an art form. That is the effect of commoditization: go upscale or downscale.
Still waiting for that, though unsure whether I see it coming at some point, especially in a way that would disrupt the business of OpenAI, Microsoft, and "friends".
They're already getting crushed by the running costs necessary for AI, especially data storage, analysis, and energy. They're trying to get the processors cheaper, since that's the easiest pressure point, hence the recent NVIDIA crash. Nostr distributes all of those costs. Since your server doesn't have to do all the analysis for everyone, but only cater to your own use case and your own data set, and can run asynchronously -- results periodically updating a data file on the server or generating an event for your (private) relay -- it's relatively cheap, and you can buy one of the older or less-powerful GPUs for $700-$1500, or just run it on your gaming PC. We literally just listed the embedding spec in the NIP repo this week, and you've already lost patience. 😅 Like, give us at least a year, okay?
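To make the asynchronous part concrete, here's a minimal sketch of that pattern: a periodic job over your own data that refreshes a local file or drafts an event for your private relay. The analyse() function and the kind number are placeholders (the real embedding event kind is whatever the draft NIP ends up specifying), and signing/publishing are left to whichever Nostr library you already use:

```typescript
// Minimal sketch of "run your own analysis asynchronously":
// a periodic job that processes only your own data set and drops the
// result either into a local data file or into a draft Nostr event
// destined for your private relay.

import { writeFile } from "node:fs/promises";

const RESULT_FILE = "analysis.json";  // local data file your apps can read
const PLACEHOLDER_KIND = 9999;        // stand-in, not a real NIP kind

// Stand-in for the expensive part (embeddings, WoT scoring, etc.)
// that would run on your own GPU or gaming PC.
function analyse(notes: string[]): Record<string, number> {
  return Object.fromEntries(notes.map((n, i) => [String(i), n.length]));
}

async function runOnce(notes: string[]) {
  const result = analyse(notes);

  // Option 1: periodically refresh a data file on your own server.
  await writeFile(RESULT_FILE, JSON.stringify(result, null, 2));

  // Option 2: wrap it in an event template for your private relay
  // (signing and publishing omitted; any Nostr library can do that step).
  const draftEvent = {
    kind: PLACEHOLDER_KIND,
    created_at: Math.floor(Date.now() / 1000),
    tags: [["d", "my-analysis"]],
    content: JSON.stringify(result),
  };
  console.log("would publish:", draftEvent);
}

// Re-run every hour; nothing here has to serve anyone but you.
setInterval(() => runOnce(["example note one", "another note"]), 60 * 60 * 1000);
runOnce(["example note one", "another note"]);
```

Nothing in that loop serves anyone else, which is why an older GPU or a gaming PC is enough.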
Yes, I know (this is kind of what I do most of my days: integrating our own data sets into prompts and responses, RAG and all). Still, though, I wonder whether and how Nostr will fit in here. So far, we see these two tendencies: companies and customers either opt for Microsoft Azure / OpenAI, to push cloud migration strategies they have had in mind for ages but never had a chance to get done, or they wait for smaller, local models they can run on-prem with an acceptable amount of hardware. Still, with the out-of-the-box approach, using this kind of stuff in Azure is extremely attractive, given all of the "features" it offers in combination, so this is hard to address for a lot of people.
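(For context, the RAG part boils down to something like this toy sketch, shown purely as an illustration: retrieve the chunks of your own data most similar to the question and prepend them to the prompt. The embeddings here are made-up fixed-size vectors; in practice they would come from a local model or whatever embedding endpoint is already in use.)

```typescript
// Toy RAG sketch: cosine-similarity retrieval over pre-embedded chunks,
// then stuff the top-k chunks into the prompt. Embeddings are fake here.

type Chunk = { text: string; embedding: number[] };

function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

function buildPrompt(question: string, queryEmbedding: number[], corpus: Chunk[], k = 3): string {
  const context = [...corpus]
    .sort((x, y) => cosine(queryEmbedding, y.embedding) - cosine(queryEmbedding, x.embedding))
    .slice(0, k)
    .map((c) => c.text)
    .join("\n---\n");

  return `Answer using only the context below.\n\nContext:\n${context}\n\nQuestion: ${question}`;
}

// Example: two pretend chunks with 3-dimensional "embeddings".
const corpus: Chunk[] = [
  { text: "Relays store and serve events.", embedding: [0.9, 0.1, 0.0] },
  { text: "Clients sign events with the user's key.", embedding: [0.1, 0.8, 0.1] },
];
console.log(buildPrompt("What do relays do?", [0.8, 0.2, 0.0], corpus, 1));
```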
For what it's worth (though mostly unrelated), we're not even necessarily missing good AI hosting here. We're missing a good alternative that covers most of the use cases people cover with Azure yet is not provided by Microsoft. An increasing number of companies and quite a few organizations, as I repeatedly have to learn, trust in that stuff. Teams is pretty much everywhere. SharePoint and OneDrive integrate with this. AD and cloud SSO are used to handle AAA. In some cases, corporations even run virtual desktops on that stuff. Azure compute resources are used to deploy microservices that speak to other "standard" third-party components in Azure (such as Azure-hosted PostgreSQL, Elasticsearch, ...). It's a technical sh_t show if you look closer, but it still gets you quite far in terms of integration, and their AI stuff tightly integrates with that both on a user level and on an application level. I have no real good idea how to get rid of that, except for really strong regulation (which I don't see happening anytime soon). It's concerning to the max (even though this was a massive deviation from AI on Nostr...). 😢
I'm doubtful that we will have much influence on the corporate front, at least for some years. In my experience, it's just too well integrated at this point. So many specialised apps are written for Windows that it makes sense for them to just go all in on Microshit. I begrudgingly use it at work and just accept that it'll probably be that way until I retire. I'm sure we will have another CrowdStrike, except far worse, before people even begin to question the stupidity of having most major infrastructure systems in one shitty basket that just keeps growing.
Exodia38 1 year ago
I agree. I wanted to add that, in the future, most people will be willing to pay client entrepreneurs and other service providers directly, because doing so will improve their opportunity costs overall.
Exodia38
Ok, now I understand, thanks. My opinion on this topic is that, in the future, everyone will be able to make clients with pretty little investment (almost zero). If clients become increasingly interoperable over time and everyone can create them, then everyone individually gets the benefits of interoperability, BUT that doesn't mean those clients will come without tradeoffs (see, I'm one of those who think that everything comes with tradeoffs; also, the more useful or convenient something is, the more tradeoffs it has).

If everyone has to deal with those tradeoffs individually (or even in a Dunbar-number fashion), that means everyone will have to soften them by paying MONEY. Summing it up: the fact that everyone can make highly interoperable clients almost for free means they will have a clear DEMAND for the things that make clients smoother. Another thing is that individuals won't really be able to deflect the cost of making clients smooth by passing that responsibility to big corporations, because that will be economically unviable in a free market.

Everything I said makes me conclude that there will never be a free lunch again, because clients will be an eternal source of demand for everything Nostr-related (relays, dev work, etc.) and the starting point for a sustained division of labor. So, saying that relays are cost centers is highly gauche.
To be fair, that kind of is what colleges are designed around. It would be neglectful for them to not help people get employed. Not why I'm there though so fuck it.
Yeah, maybe some coding class is a better choice, at least from a mental health perspective... 🙈