

Not sure what that means: the function & the signal are both named kill (kill() & SIGKILL), & the name signal is already taken (by signal()).
However, that’s a fairer position.
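For anyone who hasn’t bumped into the collision: a minimal Python sketch of the POSIX naming being referred to (Python’s os & signal modules mirror it; the sleep child process is just a throwaway).

```python
# Illustration of the naming the comment refers to, via Python's mirror of
# the POSIX API: os.kill() is the function that sends *any* signal, SIGKILL
# is merely one signal it can send, and the name "signal" is already taken
# by the module (and by signal.signal(), which installs handlers).
import os
import signal
import subprocess
import time

child = subprocess.Popen(["sleep", "60"])   # throwaway child process

os.kill(child.pid, signal.SIGTERM)          # "kill" here just means "send a signal"
time.sleep(0.1)

if child.poll() is None:                    # still running? escalate
    os.kill(child.pid, signal.SIGKILL)      # the signal that shares the name

child.wait()
```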


So the child porn still remains present, effectively.
Compulsory legal compliance still exists: it’s the free, open internet. Did you know laws existed before, too?
Ada
Don’t know about that. I think the brief descriptions on websites like nostr.com did a good enough job: there’s not much to get.
It’s a protocol, not a platform. There’s no global moderation/censorship, just as there isn’t for the internet as a whole. Relay operators have full discretion over the content available on their relays: if they want to do more than the bare minimum, they can. Clients are free to subscribe to other relays, or to several at once. It’s technically free association rather than anti-moderation.
A user can choose to see only the content of followed users: that should eliminate most unwanted content. Apart from that, there’s no perfect moderation solution even on centralized platforms, so there isn’t one here either.
Client-side filtering remains the best approach for those who care, & it doesn’t have to be manual, as I mentioned before (rough sketch below).
I recall earlier days of the internet when no one gave a fuck about this, and internet rage was just entertaining, easily ignored nonsense. Then came the eternal September, and tightass n00bs started acting like moderating the entire internet & foisting their dumbass expectations on everyone made perfect sense, without ever having to learn the zen of not giving a fuck. That was the start of when it all turned to shit.
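To make the “see only followed users” point concrete, a rough sketch of the NIP-01 request a client can make: it asks a relay only for notes authored by the pubkeys it follows, so nothing else is even delivered. The relay URL & pubkeys are placeholders, & it assumes the third-party websockets package.

```python
# Rough sketch of "see only the content of followed users" at the wire level
# (NIP-01). The relay URL and pubkeys are placeholders; requires the
# third-party `websockets` package (pip install websockets).
import asyncio
import json

import websockets

RELAY = "wss://relay.example.com"             # placeholder relay
FOLLOWED = [
    "hex-pubkey-of-someone-you-follow",       # placeholder 64-char hex pubkeys
    "hex-pubkey-of-someone-else",
]

async def followed_only_feed() -> None:
    async with websockets.connect(RELAY) as ws:
        # Ask the relay only for kind-1 text notes authored by followed pubkeys.
        await ws.send(json.dumps(["REQ", "feed", {"kinds": [1], "authors": FOLLOWED, "limit": 50}]))
        while True:
            msg = json.loads(await ws.recv())
            if msg[0] == "EVENT" and msg[1] == "feed":
                event = msg[2]
                print(event["pubkey"][:8], event["content"][:80])
            elif msg[0] == "EOSE":
                break   # relay has sent everything it had stored

asyncio.run(followed_only_feed())
```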


I’m lost. If a mod removes child porn on NOSTR, it removes it for everyone, right?
No moderators. A relay operator can remove it from their relay, & there are any number of relays. It’s roughly like Usenet with respect to server-side content removal.
Strange to ask me when authoritative information is public online. Seems you don’t know much about nostr.


Cybersquatting is fun. No one is entitled to a domain name more than anyone else.


displays of tightassery & people sick of it
we should just go for the most offensive source code possible & fuck this noise


kill
like unalive? fuck this generation.


And if they can remove that, they can remove anything.
That’s why there’s no limit on the number of relays (an event is usually published to several) or on subscriptions to them (clients usually subscribe to several).
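Roughly what that multiplicity looks like in practice: the same signed event is published to several relays, so one relay dropping it changes nothing for the other copies. The relay URLs are placeholders, signed_event is assumed to be a valid already-signed event, & the third-party websockets package is assumed.

```python
# Sketch of publishing one signed event to several relays: removal on any
# single relay leaves the other copies untouched. Relay URLs are placeholders
# and `signed_event` is assumed to be a valid, already-signed Nostr event.
import asyncio
import json

import websockets

RELAYS = [
    "wss://relay-one.example.com",
    "wss://relay-two.example.com",
    "wss://relay-three.example.com",
]

async def publish(relay_url: str, signed_event: dict) -> None:
    async with websockets.connect(relay_url) as ws:
        await ws.send(json.dumps(["EVENT", signed_event]))
        print(relay_url, await ws.recv())   # most relays reply ["OK", <id>, <bool>, <msg>]

async def broadcast(signed_event: dict) -> None:
    await asyncio.gather(*(publish(url, signed_event) for url in RELAYS))

# usage: asyncio.run(broadcast(my_signed_event))  # my_signed_event is a placeholder
```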
unpleasant time here if you loathe all moderation
Still off topic, and the moderators here aren’t reddit moderator scum so far. The modlog offers decent transparency.


Why is that better?
User control & flexibility > illegitimate authority. Also, I remember an earlier, untamed, unrulier, more subversive internet than this corporate-friendly crap: it was funner.
any community anywhere online still needs to remove CSAM and gore and other things.
Legal compliance is different from legally unnecessary moderation.
Because a hashtag under a no-moderation concept could still be hijacked.
Not really: Nostr content is cryptographically signed. The user’s client can subscribe to content curators, who publish their tags for other events as signed events of their own. The client then processes these tagging events to filter according to the user’s preferences.
Some proposals along these lines already exist.
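A sketch of the mechanism just described, not of any particular proposal: assume a curator the user opted into publishes NIP-56-style report events (kind 1984) whose “e” tags point at the events they flag, & the client turns those into a local filter. The curator pubkey is a placeholder; the events are assumed already fetched & signature-checked.

```python
# Sketch only: a client applying the signed tagging events of one opted-in
# curator. Assumes NIP-56-style reports (kind 1984) whose "e" tags carry the
# flagged event ids. The curator pubkey is a placeholder; the event dicts are
# assumed to be already fetched and signature-verified.
TRUSTED_CURATOR = "hex-pubkey-of-a-curator-you-opted-into"   # placeholder

def flagged_event_ids(curation_events: list[dict]) -> set[str]:
    """Collect the ids of events the chosen curator has reported."""
    flagged: set[str] = set()
    for event in curation_events:
        if event["kind"] == 1984 and event["pubkey"] == TRUSTED_CURATOR:
            for tag in event["tags"]:
                if tag and tag[0] == "e":
                    flagged.add(tag[1])
    return flagged

def curated_feed(feed_events: list[dict], curation_events: list[dict]) -> list[dict]:
    """Hide whatever the curator flagged; not calling this is the opt-out."""
    flagged = flagged_event_ids(curation_events)
    return [event for event in feed_events if event["id"] not in flagged]
```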
the fediverse will never be what you want it to be
Not the topic of discussion, which is function & protocol.


Yup, better. Moderation should be opt-in & is better handled at the client: the user could opt in to a “moderation community” that publishes tags their client would follow. Such opt-in curation, for anyone who wants it, is a better idea: far better than moderators we don’t get to choose.


I like Nostr better than ActivityPub: a simpler, more elegant protocol, decentralization, public-key signatures, censorship resistance, anti-moderation.
Content curation is voluntary & left to the client, giving the user greater control/flexibility & preventing moderator tyranny.
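For anyone wondering what the signatures buy, a rough stdlib-only sketch of a NIP-01 event & its id: the id is the SHA-256 of a canonical serialization, & sig is a Schnorr signature (secp256k1) over that id. The pubkey & content are placeholders, escaping corner cases are glossed over, & actual signing/verification needs a secp256k1 library that’s left out.

```python
# Rough sketch of a NIP-01 event and its id. The pubkey and content are
# placeholders; JSON escaping corner cases are glossed over; producing "sig"
# (a Schnorr signature over the id, secp256k1) needs a separate library and
# is omitted here.
import hashlib
import json
import time

def event_id(pubkey: str, created_at: int, kind: int, tags: list, content: str) -> str:
    serialized = json.dumps(
        [0, pubkey, created_at, kind, tags, content],
        separators=(",", ":"),
        ensure_ascii=False,
    )
    return hashlib.sha256(serialized.encode("utf-8")).hexdigest()

note = {
    "pubkey": "placeholder-64-char-hex-pubkey",
    "created_at": int(time.time()),
    "kind": 1,        # kind 1 = short text note
    "tags": [],
    "content": "hello nostr",
}
note["id"] = event_id(note["pubkey"], note["created_at"], note["kind"], note["tags"], note["content"])
# note["sig"] would be the author's Schnorr signature over note["id"];
# any client or relay can verify it without trusting the relay it came from.
print(note["id"])
```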


I’m astonished at the way you broke markdown to degrade accessibility. Special type of evil or inebriation?


So there are no moderation tools / whitelist/blacklists?
This is a good thing: bitchasses need to learn that words are harmless & that they can ignore them, like humanity has done for millennia. It’s not built into the servers: client-side tooling would handle it, so it’s entirely at the discretion of the user, which seems better to me.
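A tiny sketch of what that client-side tooling could look like for the whitelist/blacklist question: nothing changes on any relay; the user’s own client just decides what to render. The pubkeys are placeholders.

```python
# Sketch of purely client-side allow/block lists: relays are untouched, the
# user's client just declines to render certain pubkeys (or renders only an
# allow list). The pubkeys are placeholders.
ALLOWED_PUBKEYS = {"hex-pubkey-of-a-followed-user"}        # whitelist mode
BLOCKED_PUBKEYS = {"hex-pubkey-the-user-muted-locally"}    # blacklist mode

def visible(events: list[dict], whitelist_only: bool = False) -> list[dict]:
    """Apply the user's own allow/block lists to already-received events."""
    if whitelist_only:
        return [e for e in events if e["pubkey"] in ALLOWED_PUBKEYS]
    return [e for e in events if e["pubkey"] not in BLOCKED_PUBKEYS]
```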


Needs text alternative (inaccessible content is right wing).
figurehead monarch
actual dictator
same thing! 😊


Any communist that says they want to “form and participate in a vanguard party” has no understanding of revolutions and left theory.
So, did the Soviets get the concept wrong? It’s often claimed that the dictatorship of the proletariat doesn’t mean an actual dictator, yet there one was.
What do we call the Soviet concept of the vanguard if not vanguardism?


What a bunch of trend chasers: many of us were doing that before AI. Posers.


I get that, but don’t you get satisfaction raising inconvenient facts with rage addicts to lift their irrational outrage beyond orbit?


What’s yours?


Images of text break much that text alternatives do not. Losses when an image of text lacks an alternative: screen-reader access, text search, copy & paste, machine translation, reflowing & zooming without blur, and far smaller transfers.
Contrary to its age & humble appearance, text is an advanced technology that provides all of these capabilities absent from images.
another case of willful ignorance of meanings of words & political science
not providing alt text is right wing


Everyone eventually dies.
Not accessible: needs link to source for great justice. Lack of accessibility is right wing.