There is a pubkey who has uploaded over 6,500 files to nostrcheck.me. Many of these files contain adult content, and I have reviewed each upload by hand, one by one. Two days ago this ‘person’ started uploading illegal content involving underage children. I'm tired, very tired. For me #nostr is a breath of fresh air, but this is beyond me; this is an abuse of all of us, from the victims in the pictures to myself. #asknostr, we have to do something about this. Seriously. P.S.: they never donated a single satoshi, despite the indiscriminate abuse of my server. Never.

Replies (92)

unfortunately this is just the nature of public uploads. best we can probably do is automatic detection using ai.
This is exactly why I don't do any public server stuff: I don't want to deal with it. But I have mad respect for the devs who fight through it and do it for the community. I tried making a public blog website once; it got spammed with ads for Russian funeral services. I decided then that I didn't want to.
That's one of my fears about what could happen: if we hide the relay's IP address, we could not track these sick animals down.
Jon 1 year ago
Running AI against this shit also costs money. Indeed, uploading media should probably cost a few sats
jgx 1 year ago
How about an initial sats payment that is automatically reimbursed unless the relay operator decides to slash it? That would allow for a much higher prepayment while coming at low cost for honest users.
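A minimal sketch of what that refundable-deposit idea could look like in server code (all class and method names here are hypothetical, not part of any existing relay software):

```python
from dataclasses import dataclass

@dataclass
class Deposit:
    npub: str
    sats: int

class DepositLedger:
    """Holds per-npub upload deposits; refunded in full unless the operator slashes them."""

    def __init__(self) -> None:
        self.deposits: dict[str, Deposit] = {}

    def take_deposit(self, npub: str, sats: int) -> None:
        # Collected up front, e.g. via a Lightning invoice, before uploads are accepted.
        self.deposits[npub] = Deposit(npub, sats)

    def slash(self, npub: str) -> int:
        # Operator keeps the sats of an abusive uploader.
        return self.deposits.pop(npub).sats

    def refund(self, npub: str) -> int:
        # Honest user gets the full deposit back after the review window.
        return self.deposits.pop(npub).sats
```

The point of the scheme is that the honest path is a round trip (deposit in, deposit out), so the effective cost for honest users is near zero, while the deposit size can be set high enough to hurt spammers.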
It’s risky to do nothing; we don’t want any developers getting involved in illegal activities in the name of freedom of speech or resisting censorship. Crime should be prevented, especially when it involves underage individuals or other serious issues. Governments often target those who enable such activities, so the facilitators would be held responsible.
Josua Schmid 1 year ago
I don't know about your jurisdiction, but at least under Swiss law, if you're a hoster you only have to act on reports and notify the police if you discover something. As a provider you're not liable for hosting content you don't know about. You wouldn't need to review this shit yourself. But I can see how you'd feel it's your duty.
You might look into iftas.org. They've shown up in the last year to help fediverse admins deal with stuff like this. Right now it's mostly just forums/chat, but they're planning to create some tools to help automate this kind of thing.
Sorry to hear that, Quentin; thank you for your service. It is really hard to handle these issues as a media service provider. Maybe you can take a look at it and modify it. I've used it as a module for nfrelay.app. It is effective and really helps handle the issue. Hopefully it can also help you.
There is PhotoDNA, which is free of charge and does this well. I don't think we should spend our limited resources on that. If I were to do it myself, I could run an age-detection AI model and a nudity-detection model in a multi-model setup, but PhotoDNA is definitely better and an industry standard 🐶🐾🫡 DM me if you wish, but consider the above.
Well, you already described a very similar strategy to the one I wanted to propose. No need to DM anymore 😄 Yes: PhotoDNA for known data, and a multi-model setup using a nudity-detection + age-detection model for unknown data. That was also my proposal.
We could work on the multi-model thingy together then. @Quentin and I have been experimenting with general detection of safe/unsafe media. I can share my repo (still closed source for now, until I'm sure the code is safe and clean) and we can take it from there. Interested? 🐶🐾🤔
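The "known data" half of that pipeline boils down to matching uploads against a list of known-bad hashes. A toy sketch of the matching step, using plain SHA-256 as a stand-in (PhotoDNA itself uses a proprietary perceptual hash that survives resizing and re-encoding, accessed through Microsoft's vetted service, so exact cryptographic hashes are much weaker than the real thing):

```python
import hashlib

# In practice this set would be populated from a vetted industry hash list,
# never assembled by the operator themselves.
KNOWN_BAD_HASHES: set[str] = set()

def sha256_file(path: str) -> str:
    """Stream a file through SHA-256 without loading it all into memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def is_known_bad(path: str) -> bool:
    """Reject an upload whose exact bytes match a known-bad entry."""
    return sha256_file(path) in KNOWN_BAD_HASHES
```

Exact-hash matching only catches byte-identical re-uploads; that is precisely why the thread recommends PhotoDNA-style perceptual hashing for known material and AI classifiers for unknown material.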
nobody 1 year ago
So sorry you are fighting this right now Quentin. I’m sure you will find a responsible solution. People spreading this sort of thing need to be in prison. Plain and simple.
Is there an AI bot which can detect anything like that and automatically delete + block + report to the police? Having to see that sort of stuff would be scarring, no one should have to deal with it.
Are uploads signed with nsecs? Maybe a web of trust scheme or reputation based rate limiting would help. npubs without a clean history or low connectivity go to the back of the review queue.
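A rough sketch of what reputation-based queueing could look like (the scoring weights and names are made up for illustration; a real web-of-trust score would come from the follow graph):

```python
import heapq

def review_priority(npub_age_days: int, follows_from_trusted: int, past_strikes: int) -> int:
    """Lower score = published and reviewed sooner.
    New or poorly connected npubs, and npubs with strikes, go to the back."""
    score = 0
    score += 50 if npub_age_days < 7 else 0      # brand-new keys wait
    score += max(0, 20 - follows_from_trusted)   # WoT connectivity lowers the score
    score += 100 * past_strikes                  # prior abuse pushes far back
    return score

class ReviewQueue:
    """Priority queue of pending uploads, lowest score served first."""

    def __init__(self) -> None:
        self._heap: list[tuple[int, int, str]] = []
        self._counter = 0  # tiebreaker keeps insertion order among equal scores

    def push(self, upload_id: str, score: int) -> None:
        heapq.heappush(self._heap, (score, self._counter, upload_id))
        self._counter += 1

    def pop(self) -> str:
        return heapq.heappop(self._heap)[2]
```

Because npubs are free to mint, the score has to lean on things an attacker cannot cheaply fake, such as follows from already-trusted keys, rather than on the key itself.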
Mr Example 1 year ago
Cloudflare has that as a service. It doesn't even cost anything. I think it might even be on by default if you're using their file hosting services. Includes automatic reporting to law enforcement and follow up by them so you don't have to get involved. Sometimes, some things are best left to multinational corporations.
As @jb55 said, this probably needs NSFW filters that use AI at the client, the relays, and/or the public hosting servers. I looked into using an open-source AI model for @Damus (I think it was from Yahoo, from memory). At the time the code used the Kingfisher library, which made it a little more difficult to integrate and made the Damus app significantly larger. Server side this will have a running cost, and I think if people want those services they will probably have to pay for them. I would prefer relays that did this.
I think that was the Sphinx chat model. You give up sats to join a group and send messages, and if the chat room owner decided it was spam they could keep the sats.
jgx 1 year ago
which is a perfectly viable model, I think. It would even allow making it free for honest users and very costly for dishonest ones. There's always the possibility of rugs by the relay operators, but that's a risk the user is taking.
- RATE LIMIT by BANDWIDTH for everyone
- RATE LIMIT by SPACE for FREE npubs
- AUTO CONTENT DELETION by AI scanning
- TAG- and NIP-42-based whitelisting for adult content, IF you want to allow it

There is no fully manual solution unless you put MITIGATION tools in front of the CDN service. It's better to have a slow service than no service, or a shut-down service.
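The bandwidth limit in the list above is classically done with a token bucket per npub. A minimal sketch (rate and burst numbers are arbitrary examples, not recommendations):

```python
import time

class TokenBucket:
    """Per-npub bandwidth limiter: refills `rate` bytes/sec up to a `burst` ceiling."""

    def __init__(self, rate: float, burst: float) -> None:
        self.rate = rate            # sustained bytes per second
        self.burst = burst          # maximum instantaneous allowance
        self.tokens = burst         # start full
        self.last = time.monotonic()

    def allow(self, nbytes: int) -> bool:
        """Return True and spend tokens if this upload fits the budget."""
        now = time.monotonic()
        self.tokens = min(self.burst, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if nbytes <= self.tokens:
            self.tokens -= nbytes
            return True
        return False
```

A server would keep one bucket per npub (and perhaps a second, larger one per IP range), rejecting or delaying uploads when `allow` returns False; this is the "slow service rather than no service" trade-off the reply describes.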
jgx 1 year ago
yeah, but it is an opt-in model, and these things can be handled on the social layer through reputation. Additionally, we are not talking about huge sums: a deposit of $5 per user would already kill almost all spammers, and the individual loss in a rug is low.
1. npubs literally take less than a millisecond to generate.
2. Tor allows an attacker to keep changing their IP every minute or two, defeating IP-based blocking and rate limiting.
3. Wasting resources recognising images with an AI engine costs money.
4. It is a baseless assumption that this attacker is willing to pay to make this happen; and if they do, there are per-subscription limits, and abuse can get the subscription cancelled.

Bitcoin survives because it pays people to protect it; subscription services survive by charging people to use them.
Jon 1 year ago
I guess so 🤷
Less than a millisecond for a new npub, less than a second to auth with it, then resume spamming. Paid subscriptions can still be attacked by someone trying to break the security, but they can't upload data: the server drops it automatically and stops responding to it. There are many countermeasures; please, you and Quentin, study some network security theory. Google and Yahoo and Protonmail and Cloudflare deal with this bullshit every day, and you don't know it until someone finds a crack and jams a wedge in it.
This is a very good tip, @Quentin: #photoDNA, or any self-hosted AI model, integrated as FOSS software into the nostrcheck package. (Using external free/paid indexing or content-searching means depending on yet another third-party service, i.e. BAD by design.) Nothing gives a 100% guarantee in this cat-and-mouse game, BUT MITIGATING 80% solves a good part of the problem. Also look at PoW and/or NIP-42, which limit the attacker's resources.
PoW or fees; either way, we have Lightning network integrations everywhere. PoW will only last as long as it remains more expensive or complicated than the alternative; after that, only subscription fees will stop them. Bitcoin survives entirely because it costs more to attack it than the benefit it provides.
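For context on the PoW option: Nostr's NIP-13 defines an event's difficulty as the number of leading zero bits in its hex event id, so a relay can check it with a few lines (this is a sketch of the difficulty check only, not a full NIP-13 validator, which also compares against the committed nonce target):

```python
def leading_zero_bits(event_id_hex: str) -> int:
    """Count leading zero bits of a hex-encoded event id (NIP-13 difficulty)."""
    bits = 0
    for ch in event_id_hex:
        v = int(ch, 16)
        if v == 0:
            bits += 4                    # whole nibble is zero
        else:
            bits += 4 - v.bit_length()   # leading zeros inside the first nonzero nibble
            break
    return bits

def meets_pow(event_id_hex: str, min_difficulty: int) -> bool:
    """Would a relay requiring `min_difficulty` bits accept this event id?"""
    return leading_zero_bits(event_id_hex) >= min_difficulty
```

The cost asymmetry is the point: the uploader has to grind nonces until the hash clears the bar, while the relay verifies in microseconds.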
I feel you, and thank you for having to endure that. Sadly, I decided to shut down my free relay for a similar reason. There were some Japanese accounts that kept sharing CSAM, and I didn't want any part of the stress of whack-a-mole blocking them. It was best just to shut it down, imo. Still operating a paid relay, though.
the axiom 1 year ago
yes, by running a free service like that he is just inviting criminals in and then complaining
Glacier 1 year ago
Perhaps there's some way to adapt @npub1qtvl...7dze's proposed solution for mining pools, where miners construct their own block templates, into a nostr version that lets solo/larger/all relays construct their own content templates? It's a delicate balance; I guess this is the behind-the-scenes world of any content platform. Confident that the #nostrmind can solve this. Bitcoin is only starting out on the mission of decentralising mining @skot9000 @Solosatoshi.com 🇺🇸; feels like we should get the jump on the relays before we end up with the big 3... @captjack 🏴‍☠️✨💜 @Luke Dashjr @Bitcoin Mechanic npub1wnlu28xrq9gv77dkevck6ws4euej4v568rlvn66gf2c428tdrptqq3n3wr
shadow06 1 year ago
Sounds like an old problem... Email -> Spam
Karl 1 year ago
block the pubkey?
Running costs of a relay:
- Hardware
- Bandwidth
- Management
- Moderation

The last one seems like the biggest problem, considering the quoted example. Unless you have the funds to hire a team to do this and to develop systems that help detect and flag content for deletion, the currently manageable solution for an individual running a relay is to moderate only medium-to-highly reported posts. Is this the ideal solution? Probably not. It is the most reasonable one, though. As a user, I don't expect the behavior/work described in the quote. I'm thankful for it, but I'd never put such a heavy expectation on relay/server operators.

Shitty user: "Why haven't you deleted these posts yet?"
Operator: "Did you report it?"
Shitty user: "No"
Operator: "Did you tell your friends, family, followers to report it?"
Shitty user: "No"
Operator: "Then how do you expect me to know about it?"
Shitty user: "Spend time reviewing these posts and/or hire a team to do it!"
Operator: "Are you willing to help cover the costs, and ask your network to help too?"
Shitty user: "No"
Operator: "Then you should not expect me to do the impossible. If I see them, I'll delete them. Otherwise, even if all of your answers were yes, it'd still take time to get to them, and a few would slip through the cracks. We do what we reasonably can."
Shitty user: "That's not good enough."
Operator: "..."
This is my biggest concern and worry with Nostr. This could easily be used against you if the storage is your personal hard drive, or directly connected to you in any way. It would be WAY too simple for the Feds (of any country) to upload this stuff and then arrest you. You'd have no recourse. So, Nostr devs: what is the solution here? Is there one? I would have set up MANY nodes already, except for this significant problem.