Thread

Zero-JS Hypermedia Browser

Relays: 5
Replies: 12
Earlier I saw an #AskNostr post where someone asked whether they should report illegal content on #Nostr. How does the reporting system on Nostr work? Is there a way for relays to communicate with each other to flag 'illegal' content? And what counts as 'illegal' content, since Nostr knows no jurisdiction? I am trying to understand how 'censorship' works here. It's curious, because Nostr could become known as the armpit or 4chan of the internet. I'm wondering how a balance will be struck, if at all. ⚡
2023-11-03 18:56:42 · from 1 relay · 6 replies

Replies (12)

Every Nostr client can basically decide how to implement the reporting system. NIP-56 describes how to construct report events, but it's up to each client how to filter out content based on those reports (a sketch of such a report event follows below). Some clients have implemented functionality for reporting nudity, impersonation, spam, and "illegal activity in some jurisdiction". In practice, when some number of your contacts have reported a note, it usually gets hidden from your feed, with a button for viewing it anyway. In general, the ability to curate your following feed is enough to filter what you see, and immoral or offensive content is not a real concern for the average Nostr user.
2023-11-03 22:21:53 · from 1 relay
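A minimal sketch of what such a report could look like on the wire, assuming the NIP-56 conventions described above (kind 1984, with the report type carried as the third entry of the "e" or "p" tag); the ids and pubkey below are placeholders and signing is omitted:

    // Unsigned NIP-56 report event template (TypeScript sketch).
    // The note id and author pubkey are placeholders, not real values.
    const reportEvent = {
      kind: 1984, // NIP-56 reporting kind
      created_at: Math.floor(Date.now() / 1000),
      tags: [
        ["e", "<id-of-reported-note>", "illegal"], // report type on the "e" tag
        ["p", "<pubkey-of-reported-author>"],
      ],
      content: "optional free-text reason for the report",
    };

A client that sees events like this from accounts its user follows can count them per note and decide locally whether to hide that note behind a warning.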
There are two sides to this: the first is protecting the user from having to see the shit, and the second is tackling the content itself. Reports are used by clients to determine which content they hide from their users, taking care of the first problem (a sketch of that hiding logic follows below). As for the second problem, people often forget that Nostr relays only carry text. The actual offending content is stored on other platforms such as image hosts, and it is those platforms that have a duty to weed out abuse images and the like. This means Nostr as a whole is reasonably well equipped to tackle this problem without requiring centralized means of controlling and erasing content. It's critical that text can't be erased Nostr-wide; that's the whole point. Nostr relays do not talk to each other.
2023-11-03 23:45:31 · from 1 relay · 1 reply
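A hypothetical sketch of that client-side hiding logic, assuming the client has already fetched kind 1984 reports authored by the accounts the user follows; the threshold and function names are illustrative, not taken from any particular client:

    // Hide a note once enough of the user's contacts have reported it.
    type ReportEvent = {
      kind: number;     // 1984 for NIP-56 reports
      pubkey: string;   // the reporter
      tags: string[][]; // e.g. ["e", <noteId>, <reportType>]
    };

    const HIDE_THRESHOLD = 3; // illustrative value, not a standard

    function shouldHideNote(
      noteId: string,
      reports: ReportEvent[],       // reports already fetched from relays
      followedPubkeys: Set<string>, // the user's contact list
    ): boolean {
      const reporters = new Set<string>();
      for (const report of reports) {
        if (report.kind !== 1984) continue;
        if (!followedPubkeys.has(report.pubkey)) continue;
        // Count this contact only if the report references the note in question.
        if (report.tags.some((tag) => tag[0] === "e" && tag[1] === noteId)) {
          reporters.add(report.pubkey);
        }
      }
      return reporters.size >= HIDE_THRESHOLD;
    }

The note stays on the relays either way; the client just renders a "show anyway" control instead of the content, which matches the behaviour described in the replies above.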
I wonder... if relays don't censor illegal content, won't they have legal culpability? Depending on their location, of course. For instance, the original post I saw talked about reporting child p*rn. If a relay didn't block/censor that, would they have legal issues? Or if they allowed "hate speech", which is illegal in certain jurisdictions.
2023-11-04 00:05:08 · from 1 relay · 1 reply
Another way to think about relays is that they are the back end to clients, a bit like a distributed database. Clients are on the hook for the content they display (with all due protections as well). Relays don't store anything other than text, so there is no child porn on relays, for example, though they could store URLs to other platforms where that content is hosted. Regardless, it is those host platforms which are sitting on the illegal media. There could be words on relays that amount to "hate speech" or whatever, but if no client displays them, text sitting in a database isn't illegal in and of itself (a sketch of what a stored note looks like follows below). Relays could be compelled to remove content or provide information just like anything else, but their exposure is pretty small. Compare this with Fediverse instances, which federate and store everything locally; fuck, what a nightmare.
2023-11-04 00:18:53 · from 1 relay
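To make the "relays only carry text" point concrete, here is a hedged sketch of what a stored note referencing external media looks like; the host name, ids, and signature are placeholders:

    // A kind 1 text note as a relay stores it: a single JSON string.
    // The media itself lives at the URL on whatever host serves it;
    // the relay holds only this text.
    const storedNote = {
      id: "<event-id>",
      pubkey: "<author-pubkey>",
      kind: 1,
      created_at: 1699056000,
      tags: [],
      content: "check this out https://some-image-host.example/pic.jpg",
      sig: "<signature>",
    };

Removing the image means going to some-image-host.example; deleting the event from a relay's database does not touch the media it links to.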
Millions of you were #murdered during the #covid pandemic by officially sanctioned experts telling lies and spreading misinformation. You are still dying at a phenomenal rate today. YOUNG! and old..🤡🤑💰💀⚰️⏳ The ONLY! ones telling you the truth and saving your lives were the so-called anti-vax conspiracy theorists spreading so-called misinformation..🤔😆 Freedom's not free. Neither is life and DEATH!..😊🫂
2023-11-04 00:44:37 · from 1 relay