Maybe people didn't mention it as a problem because they didn't know it was happening. It took a person seeing CSAM under their likes for me to hear about it. Most people who responded to my note are surprised about it. I'm not trying to turn the mob against you, I just care about users not seeing child porn. I even contacted a lawyer from EFF about this late last year, because it's extremely concerning to me, for nostr in general. This is why I use Tagr. I've only spilled so many words on it because you didn't seem to understand my point, not because it's some moral crusade for me. I'm disappointed it had to turn into a flame war. That was not my intention. Internet 1, hodlbod 0 I guess

Replies (1)

Did they see the CSAM in the likes in Coracle or in another client? See, I take the fact that they didn't know as it working well. These reactions are out there. And reports are the things that gather the most heated debate between users who are reporting each other; reports generally escalate quickly. So if this were a real problem, it wouldn't just be known, it would be all over Nostr. It would be our anti-spam filter debacle all over again.

If my actions are breaking other clients and it's my fault, I am happy to fix it. But otherwise, there should be space to play around with different implementations for things, and this is one of them. This type of difference is what makes different clients different. I think what we do is positive for the network. Other clients might disagree and not send the reaction. Receiving clients can filter it out or use it in any way they see fit. And users can pick the client with the approach they like (or that they don't care about). It's all good.