Has anyone ever investigated moderation by sortition?
Instead of the current system, where people have to be recruited into a scary "moderator" role, what if everyone on a server got randomly assigned mod requests on a jury basis of, say, 3-9 people?
The status quo just sets maintainers up for burnout: as your instance grows, so does the free emotional labour you're expected to perform.
What if instead of this we all agreed to take on this responsibility as part of joining a community?
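A minimal sketch of the jury-assignment idea above, assuming a flat in-memory member list; `select_jury`, the report shape, and the names are illustrative, not part of any existing Mastodon or ActivityPub API:

```python
import random

def select_jury(members, report, size=5):
    """Randomly draw a small jury from the server's members,
    excluding the reporter and the reported account."""
    eligible = [m for m in members
                if m not in (report["reporter"], report["reported"])]
    jury_size = min(max(size, 3), 9)           # keep within the 3-9 range
    jury_size = min(jury_size, len(eligible))  # small servers: take whoever is left
    return random.sample(eligible, jury_size)

members = ["alice", "bob", "carol", "dave", "erin", "frank", "grace"]
report = {"reporter": "alice", "reported": "frank", "reason": "spam"}
print(select_jury(members, report))  # a random subset of the eligible members
```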
-
@kim @andypiper there have been many experiments with that, but ultimately it comes down to safety: you don't want $randomPerson reviewing reports of extreme content like CSAM, TVEC, and gore.
https://bsky.app/profile/rahaeli.bsky.social has discussed this at length recently. Yes, some moderation work could be handled by a jury of peers, but a lot of other moderation work requires training and trauma care.
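A rough sketch of that split, assuming each report arrives with a category label; the category names, queues, and `route_report` are placeholders rather than any real moderation API, and reports in the sensitive categories are never shown to untrained jurors:

```python
SENSITIVE_CATEGORIES = {"csam", "tvec", "gore"}  # handled only by trained moderators

def route_report(report, trained_mod_queue, peer_jury_queue):
    """Send extreme-content reports straight to trained moderators;
    everything else can go to a randomly drawn peer jury."""
    if report["category"] in SENSITIVE_CATEGORIES:
        trained_mod_queue.append(report)
    else:
        peer_jury_queue.append(report)

trained, peers = [], []
route_report({"id": 1, "category": "spam"}, trained, peers)
route_report({"id": 2, "category": "gore"}, trained, peers)
print(len(trained), len(peers))  # 1 1
```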
-
@kim sounds like Slash!
-
@kim Advogato had an interesting trust-based mechanism.
-
@kim mostly, I'd say: let a thousand flowers bloom. Different instances have different moderation structures, and what you're describing sounds really interesting.
-
@kim Reddit's upvote/downvote system would probably work, too.
-
@thisismissem @kim @andypiper I think it's possible to have defence in depth: multiple layers of moderation. Peer moderation can be a helpful addition to dedicated moderators.
I added an issue to track peer moderation options on the AP T&S repo:
https://github.com/swicg/activitypub-trust-and-safety/issues/107
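One way to sketch that defence in depth, under the assumption that a peer jury handles a report first and anything without a clear majority (or that any juror flags) escalates to dedicated, trained moderators; the vote labels and two-thirds threshold are hypothetical:

```python
from collections import Counter

def tally(votes, majority=2 / 3):
    """Return the jury's decision, or 'escalate' when the jury is split
    or any juror explicitly asked for escalation."""
    if "escalate" in votes:
        return "escalate"
    decision, count = Counter(votes).most_common(1)[0]
    return decision if count / len(votes) >= majority else "escalate"

print(tally(["remove", "remove", "dismiss"]))     # remove   (clear majority)
print(tally(["remove", "dismiss", "limit"]))      # escalate (split jury)
print(tally(["dismiss", "dismiss", "escalate"]))  # escalate (juror request)
```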