I completely reject the idea of:
Dumb relay, smart client
Why would we want the more powerful hardware that is online all day to be dumb?
It has all the potential to do the computing people want and all day to do it
Instead we want clients to, in real time, digest all these notes and compute them, filter them etc?
Anyway I'm gonna keep making smarter relays
when you frame it like that, it seems kind of obvious for relays to be less dumb
And why should relays store and transmit junk only to be filtered out by smart clients?
Because you can't trust the relay as much as the client. Simplicity is durability, compatibility, reliability.
Really? I don't think one or the other would be more or less trustworthy.
It's an interesting question. "Embrace and extend" was one of the techniques that Microsoft used to attack open protocols. They would make their implementation better and therefore create a new standard that only they controlled.
The risk would be if your relays extend the protocol, creating lock-in for the custom clients using it.
Same. Dude. Same.
I think a lot of the functionality we want in relays is better off as separate services. This isn't to say relays can't do anything clever, but the more we rely on relays to provide these services, the more centralization risk there is.
If you're concerned about centralization around one implementation: Strfry is running on most major relays. This is why we want more.
I want my clients to connect to one relay that acts as a client to the other relays
Yes I also think this is the way
Does it have to be the relay or a separate service?
If it is a separate service aren't we splitting the responsibilities and controls better?
You're running the client. Someone else is running the relay.
Smart relay, client-controlled? (API)
Neither is necessarily the case. Moreover, even if the client is doing absolutely everything locally, it doesn't mean I as the user have any idea how it works under the hood. There's just as much trust involved either way.
Nostr's core idea isn't dumb relays.
Mastodon sucks because you don't own your identity, it belongs to the server. That isn't the case with a smart nostr relay that can filter and order notes for you.
I would say the biggest thing that sets nostr apart being open and interoperable.
You want to run a relay on your phone? Go for it.
You want to run a relay that needs a data center to analyze every note, build custom spam protection and recommendations, and only send me notes its AI thinks I'm interested in? Well, that sounds a bit expensive and complicated, but go for it!
I dunno man.
I've been here a while.
I seem to recall that the core idea was dumb relays smart clients.
Now some people want to build smart relays, which is fine. Nostr is a protocol. The weird bit is the people who can't just say, "hey! I wanna build me some smart relay!"
They have to pull that weird culty millennial thing where they try to change reality to match their desires, rather than just be free to do what they want without gaslighting their detractors into submission.
Perhaps I am wrong but my understanding of the fediverse is that your identity and your content is tied to your home server.
That seems to be the key difference.
If the content on nostr is open and our identities are not owned or controlled by relays it seems like there is no reason not to experiment with as many relay implementations as possible.
The next evolution of nostr: clients that run on servers.
Yup, exactly.
That key difference suddenly makes Smart Clients more centralizing and discriminatory than Smart Relays.
Smart Relays can be accessed by anyone that has even the lightest, dumbest device, and they have a ton of competition from other Smart Relays that sovereign profiles can take their business to.
You want communities/groups to have (and take responsibility for) their own relay, server, and fallback options.
Since anyone is inevitably part of multiple overlapping communities, you don't really have the whole single-point-of-failure thing.
Haha, that last point resonated.
You are able to choose which client to run, or even write your own. If there's a bug you can switch to another one until it's fixed. Depending on your circumstances you can choose between convenience and security. Relays are shared by everyone, and simplicity improves stability.
A DVM is a smart client
Yes. This makes sense to me.
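For readers unfamiliar with the term: a DVM (Data Vending Machine, NIP-90) is a service that consumes a job-request event and publishes a result event back. A rough, hypothetical sketch of a job-request payload, with placeholder kind and tag values (NIP-90 reserves kinds 5000-5999 for job requests; the exact kind depends on the job type):

```python
import json
import time

# Rough sketch of a NIP-90 (Data Vending Machine) job-request payload.
# It is unsigned and incomplete (no id/sig); kind numbers 5000-5999 are
# reserved for job requests, and the specific kind and tag values here
# are placeholders for illustration only.
job_request = {
    "kind": 5300,                        # e.g. a content-discovery job
    "created_at": int(time.time()),
    "tags": [
        ["i", "<job input>", "text"],    # "i" tag carries the job input
        ["relays", "wss://relay.example.com"],  # where to publish results
    ],
    "content": "",
    "pubkey": "<client pubkey hex>",     # placeholder author key
}

print(json.dumps(job_request, indent=2))
```

The point of the sketch: the "smart" work lives in a service addressed through ordinary events, so relays themselves can stay simple.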
Brave AI agrees: a hybrid approach is the way.

yap

Let's see if this equally applies to relays.
You are able to choose which relay to connect to. - True
You are able to run your own relay. - True
You can even write your own relay software. - True
If there is a bug, you can switch to another relay until it is fixed. - True
Depending on your circumstance, you can choose between convenient relays or secure ones. - True
"Relays are shared by everyone." - Not true
Relays are shared by those who choose to connect to a given relay. Someone may choose to run their own personal relay that only they can post to in order to back up their own notes. Or, they may run a relay to back up their notes and all the notes they care about from others (anyone they follow) that they also allow others to read from, but not write to. Or they may run a relay that is only for a small group of people to read and write to, such as their church. Or they may have the hardware and bandwidth that can handle running a large public relay that anyone can post to or read from, but even then that does not mean everyone will use it.
Relays are just as diverse as clients, if not more so, and users have a lot of choice about what relays they will connect to, or even run their own. So again, I see no real reason why users should trust clients more than relays.
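The access-control variants described above (personal backup relay, read-only-for-others relay, small-group relay) all reduce to the same mechanism. A minimal sketch, assuming a hypothetical relay that checks the NIP-01 `pubkey` field against an allowlist before accepting a write (the function and key names are illustrative, not from any real relay implementation):

```python
import json

# Hypothetical relay-side write policy. NIP-01 events carry a "pubkey"
# field, so a personal or community relay can check the author against
# an allowlist before accepting a write; reads (REQ etc.) pass through.

ALLOWED_WRITERS = {
    "alice-pubkey-hex",   # placeholder hex pubkeys
    "bob-pubkey-hex",
}

def accept_message(raw: str) -> bool:
    """Return True if an incoming ["EVENT", {...}] write is allowed."""
    msg = json.loads(raw)
    if msg[0] != "EVENT":        # only writes are policed here
        return True
    return msg[1].get("pubkey") in ALLOWED_WRITERS
```

A church relay, a follows-only backup relay, and a solo relay differ only in what goes into the allowlist.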
Bringing the 🔥! It seems to me that more people will run clients than relays, and the client has the last word before your eyeballs, so I would say that you have to trust the client more than the relay. And you can only connect to relays that implement all of the features that your client expects, so adding sophisticated features to relays has lower marginal utility than adding them to clients. OTOH, pushing all of the complexity into clients results in fewer clients that are harder to build, and you don't want that either. There's a balance that the community will ultimately find, and my expectation is that it will involve clients testing out new ideas, and relays adopting ones that have become stable and expected.
But what really matters is that it doesn't matter who does what where, because it's all an open protocol and gfy, we're living in the future!
To bring it back to the OP's "Why would we want the more powerful hardware that is online all day to be dumb?", I don't think relays should be smart, I think they should be efficient. As you pointed out they need to protect themselves from things like spam to keep costs down and stay alive. As someone else pointed out, we can have things run in the datacenter as other clients / DVMs, and still have simple, efficient relays. So at the protocol level, my expectation is that the more we burden the relays the less reliable and more expensive they'll become. OTOH we can have simple web clients, fancy web clients, slick mobile clients, and colossal workstation clients, and the cost and reliability is paid for by the person running it. We should try and figure out how to compensate relay runners not so they'll add more features, but just so they'll keep doing what they're doing. It's hard enough already and they aren't getting as much out as they're putting in.
And of course: we're all going to do whatever we think is best, because that's what makes nostr awesome! Apparently all we needed to do was remember that we can change the world. How long have we been sleeping? Well, we aren't anymore.
Well I like my Bitcoin node dumb, "smartness" brings a higher attack surface.
Keep foundations simple.
This mostly boils down to the ancient thin server/thin client discussion.
If you look at it from a trust perspective, however, a thin relay makes more sense, because how does a relay know what I, as the end user, consider worthy of my attention? Especially if the relay filters out stuff that I am not aware of.
It's a bit the same as email. While most servers actively filter spam, it's still embedded in most clients as well. However, that also led us to the point where running your own email server becomes a problem, as email is now centralized around the big 3 cloud providers, which will just actively block half the IP address space on the internet because those addresses mostly generated spam.
Looking at it from a real-life perspective, social interactions are always P2P, so I think striving toward that goal makes the most sense.
The concept of a "dumb relay," as it was initially coined, should in my opinion be framed in terms of minimum necessary functionality. The backbone of Nostr must be able to sustain itself with relays that expose the minimum functionality provided by NIP-01, each addition being a bonus, cooperating with what clients can make available.
This is not a matter of shifting workloads from one end to the other, but of seeking synergy, where the result is richer than the algebraic sum of the two parts.
There will inevitably be overlaps, redundancies, and that is good, as this is also in its way a form of decentralization on the relay-client axis.
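For concreteness, the NIP-01 floor mentioned above is small: a client subscribes by sending `["REQ", <subscription id>, <filter>]` and the relay streams back matching events. A minimal sketch of building such a request (`authors`, `kinds`, and `limit` are standard NIP-01 filter fields; the pubkey value is a placeholder):

```python
import json

# Build a minimal NIP-01 subscription request:
# ["REQ", <subscription id>, <filter>].

def build_req(sub_id: str, authors: list, limit: int = 20) -> str:
    """Serialize a REQ asking the relay for recent kind-1 text notes."""
    note_filter = {
        "authors": authors,  # hex pubkeys whose events we want
        "kinds": [1],        # kind 1 = short text note
        "limit": limit,      # max number of stored events to return
    }
    return json.dumps(["REQ", sub_id, note_filter])

print(build_req("sub1", ["<some pubkey hex>"]))
```

Anything a relay does beyond answering filters like this is, in the terms above, a bonus on top of the backbone.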
You can run both, or use relays you trust, for whatever reason.
Yeah, servers have a purpose and function, so utilizing them fully is reasonable. I think we may want a mix of both complex and simple relays.
Runtime costs of simple relays are lower and that has a decentralization benefit.
We defo need nostr 2.0, but it only becomes successful if there's a uniform interface. #ditto is the best effort so far.
You have it wrong. In terms of "algorithms", the ideal is:
dumb relay,
dumb client
Dumb public relays = from the protocol point of view. From there everyone is free to do whatever they like.