If we can’t make a social media ecosystem where people feel safe enough to share, then they’ll go somewhere else. It’s like a dark alley in a city where no one does public safety. You can walk down these back alleys, but it’s risky. If you’re strong and have guns, you’re fine. Or if you’ve got a security guard. Or if you have a lot of social capital protecting you, most people won’t mess with you. But on Nostr we see folks who don’t have that power or protection be attacked with death threats for posting a selfie, or doxxed for publishing an accurate content report. This kind of harassment silences speech. We need tools that let folks feel safe enough to share and build communities. Otherwise they’ll opt for the evil mall cop of corporate social media platforms, because at least there is some safety in that authoritarianism. Read the experiences of women on Nostr.

Replies (68)

🥳🥳🥳
How do we proceed with this? The only thing that comes to mind is shared blocklists, with popular relays blocking posts based on presence on those blocklists. Do you have a proposed solution or steps we can take?
Chad Lupkes 1 year ago
What about an indicator that says "This profile has been blocked/muted by ## users on Nostr." That way before someone chooses to follow a profile, we can see troublemakers before they can see us. A troublemaker can post whatever they want, but if I don't want to see it, I don't see it. @npub1mj2n...z5r2 Nostr DOES allow users to be muted, blocked and reported. But it's not clear how effective that is, and having a clear indicator would help.
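The indicator idea could be prototyped by counting how many fetched mute lists contain a given pubkey. A minimal sketch, assuming mute lists arrive as NIP-51 kind-10000 events with `"p"` tags; the event shape is simplified and fetching from relays is out of scope:

```python
# Sketch: count how many distinct users' mute lists include a pubkey,
# to power a "muted by N users" indicator. Events are plain dicts here.

def mute_count(target_pubkey, mute_list_events):
    """Count distinct authors whose mute list contains target_pubkey."""
    muters = set()
    for event in mute_list_events:
        for tag in event.get("tags", []):
            if len(tag) >= 2 and tag[0] == "p" and tag[1] == target_pubkey:
                muters.add(event["pubkey"])
                break  # one hit per author is enough
    return len(muters)

events = [
    {"pubkey": "alice", "kind": 10000, "tags": [["p", "troll1"], ["p", "troll2"]]},
    {"pubkey": "bob",   "kind": 10000, "tags": [["p", "troll1"]]},
    {"pubkey": "carol", "kind": 10000, "tags": []},
]
print(mute_count("troll1", events))  # 2
```

A client could render this as "This profile has been muted by 2 users" before you choose to follow.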
I doubt popular relays would sign up to block notes from certain npubs, but a first step could be letting npubs express a preference not to be mentioned by certain other npubs. I don't see why those types of lists wouldn't be welcomed by everyone: if someone is publicly willing to say "I don't want this npub to interact with me," the rest of the network should support that.
Moderation lists with the ability to subscribe, created by users, where each person chooses their own moderators... that's my proposed solution, though I don't know how difficult it would be to implement here 👀
I feel like this would not fly 😂 I would of course be totally fine with it, but then you have to think about people abusing the blocking function. Then the numbers wouldn’t truly represent the user and their content (I’m sure I’d have a lot of blocks, but if you know me, I’ve only been kind on here). I think it’s just a matter of considering options and seeing it as an issue, like the poster mentioned, versus completely ignoring it. I’m glad a lot of different people are talking about it today 🥹
Or recruit women to work on clients and nostr services. A more diverse team building the apps will mean that the app serves the needs of more kinds of people. Talk to users! Talk to people who’ve decided not to use your apps. Ask open ended questions. Read this: The Mom Test: How to Talk to Customers and Learn If Your Business is a Good Idea when Everyone is Lying to You
The goal is to be free to post. Carefree. Not worrying about who won’t make you feel safe! At some point we have to realize the naysayers don’t matter.
And yeah, Primal already has this in a way. You can still post messages to relays... but it's possible to subscribe to each other's mute lists. So it's more of a problem of organizing community and then having support of clients to maybe eventually get to some kind of default "no assholes" list.
Chad Lupkes 1 year ago
And a lot of that is thanks to you putting your perspective out there, which is at the core of real social media. We can't control what other people do or say, but we can feed positive or negative energy with our reaction. Like @Toshi mentioned, humanity is humanity and that includes both positive and negative. We just want to make sure that we are feeding the positive.
Would it be possible to build a limiter like what Twitter has, where people you don't follow can't reply or otherwise engage with you? If people who are harassed are given control to consent to social interactions as a starting point, that might be worth considering.
Chad Lupkes 1 year ago
@rabble has anyone done work to create a 'social circles' visualization based on nodes and follows on the Nostr network yet? There's a lot of opportunity there for our social network to be open and clear about the "six degrees of separation" between profiles, and I think catching it early enough so that we could see those circles form and grow would be amazing to watch.
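The "degrees of separation" question reduces to shortest-path search over the follow graph. A minimal sketch with a toy graph; a real version would crawl contact-list events (kind 3) to build the adjacency data:

```python
from collections import deque

# Breadth-first search over a follow graph to find the hop distance
# between two profiles. The graph is a toy dict of pubkey -> followed pubkeys.

def degrees(a, b, follows):
    """Return the number of follow-hops from a to b, or None if unreachable."""
    if a == b:
        return 0
    queue, dist = deque([a]), {a: 0}
    while queue:
        cur = queue.popleft()
        for nxt in follows.get(cur, ()):
            if nxt not in dist:
                dist[nxt] = dist[cur] + 1
                if nxt == b:
                    return dist[nxt]
                queue.append(nxt)
    return None  # not connected

graph = {"a": ["b"], "b": ["c"], "c": ["d"]}
print(degrees("a", "d", graph))  # 3
```

A visualization could then bucket profiles by distance from the viewer to draw the "social circles".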
zoé 1 year ago
orrr we could realize that there’s no need in making someone feel unsafe. communication should matter. it’s important to hear the naysayers. after all, we all have the ability to grow together, regardless of who was “right” or “wrong” first. feeling free to post is needed, yes, but so is building spaces in which everyone can thrive and be challenged without being pushed out. get it?
First the weirdos build something cool, then some more weirdos come and make it cooler, then tourists hear of the cool place and corpos project economic potential, they both demand the sharp edges be rounded and the weirdos be muzzled for tourists safety and brand safety. This is Cultural Gentrification and I have seen it happen countless times before.
> ... where people feel safe ...

This is the problem. You can make people "feel" safe, but you can't actually make them safe. The 'safety' provided by centralized platforms is an illusion. I can go on any platform, sign up and post horrible things. Sure, my account will get deleted, but it will be too late. Fortunately, the danger is mostly an illusion too. No one can physically harm you over the internet. Sticks and stones, etc. If your mental state is really so fragile that you can't handle any criticism, you need to seek help.

Doxxing is a real problem. The only protection for that is taking privacy and anonymity seriously, and YOU are the only person who can realistically do that. No rule, law or policeman can prevent you from sharing your personal information. It is completely brain dead to use your real name on public forums (including Facebook, Twitter and Nostr) and still expect to retain privacy. When you put your information out there, you become a public figure and lose both practical and legal privacy protections.

I always hear people saying things like "oh, it's all out there anyway" or "the government already knows everything." First, those are bullshit. Second, you're not improving matters by being lazy. There are real steps you can take that will make you much safer. If you would rather make up excuses than learn, that's your problem.

"Those who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety."
Yeah, that's basically what I'm proposing. At the protocol level it's possible to craft any type of note, but relays could check: if a note contains a reference to an npub that blocks mentions by you, they wouldn't accept or propagate your note. They could also stop returning your notes if the request comes from a certain npub.
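That relay-side check could look like the sketch below. Note the "mention-block" list is hypothetical: no NIP currently defines it, so the data structure here is an assumption, and the event shape is simplified:

```python
# Hypothetical relay admission policy: reject an incoming note if it
# mentions (p-tags) any pubkey that has declared it refuses mentions
# from the note's author. mention_blocks is an assumed structure:
# pubkey -> set of pubkeys it refuses mentions from.

def should_accept(event, mention_blocks):
    """Return False if any mentioned pubkey has blocked this author."""
    author = event["pubkey"]
    for tag in event.get("tags", []):
        if len(tag) >= 2 and tag[0] == "p":
            mentioned = tag[1]
            if author in mention_blocks.get(mentioned, set()):
                return False  # the mentioned user has blocked this author
    return True

blocks = {"jane": {"harasser"}}
note = {"pubkey": "harasser", "tags": [["p", "jane"]], "content": "hi"}
print(should_accept(note, blocks))  # False
```

A relay running this before accepting an EVENT would silently drop harassing mentions without any client changes.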
Maybe devs could make a #sparta button? One that mutes and blocks both ways… like being kicked into the void. That would give anyone feeling under threat a way of feeling a bit safer. If they came back with another npub, just do it again… just my two sats worth.
There is something to the Bluesky feedback about Nostr being more akin to 4chan (with b!tcoin chat lol). But e.g. Coracle's web of trust slider and Amethyst's security filters solve a lot of this if you don't want that experience. No idea what Damus offers, but I'm mostly on these two clients and happy with them. I rarely ever look at the global feed (mostly because it's just nonsense and even more AI b!tcoin 'art' and slogans, so I don't even use Amethyst's filters). It's like looking at your spam folder; it's not even as entertaining as 4chan.
I will talk to the CEO and we will get this fixed right away! We don’t want anyone to feel unsafe!!!! 🥳
Interesting. So far, one of the few notes I have read in these threads trying to solve the issue. A mute list of previous offenders could be useful to the women of Nostr looking for a safer experience. Opt-in, of course.
It is sad to see that some women on Nostr are getting harassed. I know there are no perfect solutions yet, but I appreciate those who are thinking it out and working on it. Sadly, there are always going to be assholes. The mute button works for most of them, but when it comes to serial harassment, these people can spin up new accounts easily and continue to harass. Good idea about improving ways to curate and share mute lists.
I’ve gone through the replies to this thread and muted the obvious offenders, though I’m wishing for some way to ensure my presence here benefits them in no way, and that my computation is not used to convey their toxic bits. I liked that property/idea of Scuttlebutt, even though it was maybe more of an idea than it was effective.
I see this as a client failure to implement relay configuration / the outbox model. Clients experimented and went down a path of free and open relays, with zero-conf relay configs. They thought they could rely on WoT and mute lists, and put off implementing outbox.

With the outbox model, you can set which relay to use as your INBOX and OUTBOX. The INBOX is the ONLY place you read comments and reactions from, and you can set up a relay to do this like @Laeserin 🇻🇦 has already done. She has a relay that limits posting to only her follows (using listr) + any other pubkeys she wants on there (set up by relay.tools). The only additional thing relays can do is add NIP-42 auth, so that reads could be restricted to that same set of pubkeys. This would enable tiers of protection from anything nostr throws at you.

The benefit of outbox vs. mute lists is the pre-emptive ability to control what you see. Seeing stuff is disturbing, and muting it after seeing it is not good enough on its own. And unfortunately, even when someone sets up a relay config in one client that works, if they jump to a client that doesn't use relay configs properly, they get a barrage of all that stuff they didn't want to see (like Primal).

Anyway, I gotta run, but I wanted to help describe what I see on my end, since I spend a lot of time on the relay side of things and I am well familiar with comment bots (which are technically no different than humans following someone around and commenting on them).
Relay-based filtering. Only load comments, replies and mentions from relays that you deem safe. That should be the default in all clients. Which relays are safe? Relays that require some form of friction for accepting events, and that actively ban people who have been flagged with kind 1984.
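The "actively ban flagged people" part could be as simple as tallying kind-1984 report events per reported pubkey. A minimal sketch; the threshold is arbitrary and the event shape is simplified:

```python
# Sketch: a relay tallies kind-1984 report events and bans pubkeys
# flagged by at least `threshold` distinct reporters. In practice you
# would also want to weight reporters (e.g. by web of trust) so a
# single person with many keys can't get someone banned.

def banned_pubkeys(report_events, threshold=3):
    """Return the set of pubkeys reported by >= threshold distinct reporters."""
    reporters = {}  # flagged pubkey -> set of reporter pubkeys
    for ev in report_events:
        if ev.get("kind") != 1984:
            continue
        for tag in ev.get("tags", []):
            if len(tag) >= 2 and tag[0] == "p":
                reporters.setdefault(tag[1], set()).add(ev["pubkey"])
    return {pk for pk, who in reporters.items() if len(who) >= threshold}

reports = [
    {"kind": 1984, "pubkey": f"reporter{i}", "tags": [["p", "spammer"]]}
    for i in range(4)
]
print(banned_pubkeys(reports))  # {'spammer'}
```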
Well said, Rabble. It's hard to find a working solution, though. We're never going to have one that makes everyone happy, but I do have faith that we can have one that makes the masses happy. We just need to attack this with the understanding that no system is perfect, while looking for the one that checks the most boxes and can give users the most success.
Yes, but we could create filters for new users and use a web of trust to filter out these accounts. Eventually, with good LLMs, you could create troll accounts that do the work of appearing legit before they start their abuse, but for now we're not yet facing those issues.
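A basic web-of-trust filter is just a bounded reachability check over follows: show a stranger only if someone within a couple of hops of you follows them. A minimal sketch with a toy graph; real clients would build the graph from contact-list events:

```python
# Sketch: accept a candidate pubkey if it is followed by someone within
# max_depth hops of `me` in the follow graph. The depth limit of 2 is an
# arbitrary choice; `follows` maps pubkey -> set of followed pubkeys.

def in_web_of_trust(me, candidate, follows, max_depth=2):
    frontier, seen = {me}, {me}
    for _ in range(max_depth):
        next_frontier = set()
        for pk in frontier:
            for f in follows.get(pk, set()):
                if f == candidate:
                    return True  # someone in my extended graph follows them
                if f not in seen:
                    seen.add(f)
                    next_frontier.add(f)
        frontier = next_frontier
    return False

graph = {"me": {"alice"}, "alice": {"bob"}}
print(in_web_of_trust("me", "bob", graph))   # True
print(in_web_of_trust("me", "spam", graph))  # False
```

A client could apply this to replies and mentions by default, with a toggle to widen or disable it.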
We’re working on a moderation bot that will accept encrypted report notes; the bot can then review the content via AI and a human, and then issue a report on that content. Basically, it’s not always a good idea to be public about your report. Also, clearly we need the ability to vet reports, because often a report is not accurate or isn’t enough. So I can say: sure, Bob reported Jane as being a spammer, but I don’t agree, or this moderation group looked at Jane’s content and decided Bob’s report didn’t have merit. Sure, others might agree with Bob’s report, but the point is that we need a second level, and the ability to have some reviews.
@Rabble what is the problem? Trolls or people not agreeing with someone's post??? I have only seen maybe four social-justice-warrior trolls and a couple of troll-trolls. But definitely not an issue with nostr. Also the example you gave is heavily biased and shows nothing of the problem. The "I am a marginalized woman and you don't understand me" doesn't work. We are all marginalized these days and everybody is a unique individual. This means nothing to me. Please give examples of the exact problem.
τέχνη 1 year ago
Nostr is like putting things on a telephone pole on a busy street corner. You would never do this if you wanted to be “safe”. How nostr works is not conducive to safety. Users are second class and Relays are first class. We should look to other protocols for safe and private communication where users have control of their data. Something like Urbit or maybe the next iteration of SSB
Too easily gamed. 733T-speak is enough to defeat a regex filter. "Shakespearean" insults and threats get past the smaller LLMs, too. Any LLM smart enough to defeat a level-two troll is not something a client can afford to run (or even most relays). Actively shared mute lists are what works for email.
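The point about regex filters is easy to demonstrate. A sketch with a placeholder banned word: even after normalizing common leet substitutions (a small hand-picked mapping, not exhaustive), trivial variants still slip through:

```python
import re

# Demonstrates why word filters are easily gamed: a naive banned-word
# regex misses leetspeak, a substitution-normalizer catches only the
# easy cases. "idiot" stands in for any banned term.

LEET = str.maketrans({"0": "o", "1": "i", "3": "e", "4": "a", "5": "s", "7": "t"})
banned = re.compile(r"\bidiot\b")

def flagged(text):
    """Normalize common leet substitutions, then test the banned-word regex."""
    return bool(banned.search(text.lower().translate(LEET)))

print(flagged("what an 1d10t"))      # True  (normalization catches it)
print(flagged("what an i.d.i.o.t"))  # False (punctuation split still evades it)
```

Each counter-evasion rule invites a new evasion, which is why the reply above lands on shared mute lists rather than content filters.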
NOSTR is anonymous unless they doxx themselves. Snowflakes are mistaking “unsafe” for “offended.”
the axiom 1 year ago
What happened with the idea of community relays? Once you have a closed community it becomes really hard to harass.
RUN BIYING 1 year ago
Is there a 100% safe place online? Instead of trying to build something “perfect” for people to feel safe to post, self-protection and awareness of potential harassment on the internet are more essential, since you cannot control who has access to online content.
🐈 1 year ago
how do we provide a safe nostr experience?
Yes. Fiatjaf gave me the master key. You can please the majority, but never please everyone. We should build tools for the masses, but also not completely alienate the small groups of users either. It's a tough job, but as I said, I have faith. We have the smartest people working to build us a better future.
You are beginning to see what we went through on the Fediverse side with the censorship-happy policing of speech. Nostr clients have the mute button... that's all you need.
Love that complete philosophical dodge and superiority complex. So grateful that we have people like you taking care of us and knowing what is best for everyone <3
Josua Schmid 1 year ago
I object to the notion of "anyone doing public safety". We all should be part of public safety. You cannot buy or employ it. If each individual stepped up, there'd be no dark alley. Like in the little remote village where everyone knew each other. In functionally proxied social groups, the concept of "public safety" is not needed. We should rather work on the proxy mechanisms.
I've long been a hater of Twitter's "flat" model, where everything comes out of the same firehose. It means it's impossible to moderate communities like you can on forums or on sites like Reddit. I see Twitter-clone Nostr falling into this same trap. The unmoderated internet fucking sucks. This comic is from 2004ish (pre modern social media) and is still just as true today, if not more so now that we're on impossible-to-moderate platforms.
Honestly I prefer chat above both. People are usually more cordial with each other since there's not an audience to talk to.
I struggle to understand how the mute button doesn't solve this. In my experience, the mute button seems to be incredibly effective at eliminating noise. I would say far more effective than my Twitter mute experience.
Maybe you should leave. Honest question. Risky freedom is the only freedom. There is no safe freedom. Never has been. Never will be. Personally I hope you stay!