This seems to follow the same principles we’ve been talking about for Nos social. The major difference is that we are leaning into NIP-32 labels for “passing judgement” on users and content rather than lists, which I think is more expressive and will scale better. For instance, you can have client-side rules that say things like “if 1 of my friends reported this person, put a content warning on their stuff; if 3 reported them, don’t show them at all”. We’re about to start building a web app that allows relay owners (or any user) to view content reports and take action on them.

I think shared blocklists are still really useful in the short term. This is the main way Matrix fights spam to this day, and I believe it’s still working for them at their scale. It would be nice if we could leverage the existing mute lists (kind 10000) somehow, as there is already a lot of valuable moderation-type data sitting in them.

I would be careful about using the word "block" because it implies that the other user can’t see your content once you’ve blocked them (which is true on most social platforms today, but not on Nostr).

I wrote some more about our vision for moderation here: naddr1qqxnzd3cxsurvd3jxyungdesqgsq7gkqd6kpqqngfm7vdr6ks4qwsdpdzcya2z9u6scjcquwvx203dsrqsqqqa282c44pk
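
To make the threshold idea concrete, here’s a rough TypeScript sketch of the kind of client-side rule I mean. It assumes NIP-32 label events (kind 1985) that tag the labeled user with a “p” tag; the function names and thresholds are just illustrative, nothing we’ve actually shipped.

```typescript
// Hypothetical client-side moderation rule. Event kind, tag layout, and
// thresholds are assumptions for illustration only.

type NostrEvent = {
  kind: number;     // e.g. 1985 for a NIP-32 label event
  pubkey: string;   // author of the label/report
  tags: string[][]; // e.g. ["p", "<labeled pubkey>"]
};

type Verdict = "show" | "content-warning" | "hide";

function moderationVerdict(
  targetPubkey: string,
  labelEvents: NostrEvent[], // labels/reports fetched from relays
  friends: Set<string>,      // pubkeys the local user follows
): Verdict {
  // Count distinct friends who have labeled/reported this user.
  const reporters = new Set<string>();
  for (const ev of labelEvents) {
    if (ev.kind !== 1985) continue;        // only NIP-32 label events
    if (!friends.has(ev.pubkey)) continue; // only trust my follows
    const targetsUser = ev.tags.some(
      (t) => t[0] === "p" && t[1] === targetPubkey,
    );
    if (targetsUser) reporters.add(ev.pubkey);
  }

  // "if 1 of my friends reported this person, put a content warning on
  //  their stuff; if 3 reported them, don't show them at all"
  if (reporters.size >= 3) return "hide";
  if (reporters.size >= 1) return "content-warning";
  return "show";
}
```

Because the verdict is computed entirely client-side, each user (or client) can pick their own thresholds, or ignore reports altogether.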

Replies (2)

My thoughts: Legal content that isn’t abusively spamming the system should never be blocked by the system. But if individuals want to block someone… yes, they should be able to do that.

As for setting up to auto-block anyone that three of my follows/friends have reported - well, that would suck, because I often follow spinmeisters & other shills so I can expose their lies. And these bad players often gang together to get truthful & good content & accounts banned. So please don’t make it easy for them. On the contrary, how about making an algo to temporarily take away reporting privileges when they’re obviously mass/spam reporting and/or they’ve reported X amount of content that turned out to not be in violation?
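
Something like this rough sketch, maybe (all names and thresholds are made up, and it assumes the moderation tooling keeps per-reporter stats, which nobody has committed to):

```typescript
// Illustrative only: suspend reporting privileges for accounts that
// mass-report or whose reviewed reports are mostly not upheld.

type ReporterStats = {
  reportsLastHour: number; // recent reporting volume
  resolvedReports: number; // reports a moderator has reviewed
  upheldReports: number;   // of those, how many were actual violations
};

function reportingSuspended(stats: ReporterStats): boolean {
  // Obvious mass/spam reporting: temporary suspension.
  if (stats.reportsLastHour > 50) return true;

  // Too many reviewed reports turned out not to be violations.
  if (stats.resolvedReports >= 10) {
    const accuracy = stats.upheldReports / stats.resolvedReports;
    if (accuracy < 0.2) return true;
  }
  return false;
}
```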