I'm in violent agreement with all of that. But weren't we talking about cyberbullying? What you've described just sounds like filtering content at one's relays, not reporting or moderation. Better UIs to surface relay filters seem like a strong advance in the direction of "be your own algorithm."

Replies (2)

If you know your relay cares about cyberbullying under some definition of the term, you can report that activity to the relay operators and have the bully kicked off the relay. If you run a Democrat-only relay, you can report Republicans. The domain of censorship is the relay, not the entirety of Nostr. The mechanisms for reporting are things like NIP-56, and the mechanisms for moderation are things like NIP-86. Here's a rough sketch of what a NIP-56 report looks like on the wire (the pubkey and event id are placeholders, and the NIP-01 signing/publishing step is left to whatever signer you use):

```typescript
// Minimal sketch of a NIP-56 report event (kind 1984).
// Values in <angle brackets> are placeholders; id/sig are omitted.
interface EventTemplate {
  kind: number;
  created_at: number;
  tags: string[][];
  content: string;
}

const report: EventTemplate = {
  kind: 1984, // NIP-56 reporting
  created_at: Math.floor(Date.now() / 1000),
  tags: [
    // report the offending pubkey; the third element is the report type
    ["p", "<hex-pubkey-of-the-bully>", "other"],
    // optionally point at a specific offending event
    ["e", "<hex-event-id>", "other"],
  ],
  content: "Targeted harassment of another user on this relay.",
};

// Sign this template and publish it to the relay whose operators you
// expect to act on it. On their side, NIP-86's relay management API
// covers actions like banning a pubkey, which is where the actual
// moderation happens.
```

Whether anything happens with that report is entirely up to the operators of the relay you sent it to, which is the point: the report travels to a relay, not to "Nostr" as a whole.
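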
We can't really see how censorship-resistant Nostr truly is until relay operators start removing content beyond the agreed-upon spam & other obviously bad things. There's some funny situational irony in that. Creating unique experiences across many relays seems like the most logical way forward to me, since that can also help the user base grow. Anything else would just be war games.