Federated Moderation: Towards Delegated Moderation?

Note: This topic was first posted to SocialHub, and has a Fediverse Futures companion on Lemmy.

Introduction

In Improving fediverse culture and social behavior along the way I introduced two ideas that I think are interesting enough to warrant a separate thread. The brainstorm will start in the Lemmy Fediverse Futures ideation space and be elaborated here (if there’s interest). See Lemmy: https://lemmy.ml/post/60475

Note: There’s ongoing research on Moderation by @Audrey and @robertwgehl.

(The sections below will be duplicated to the Lemmy brainstorming post.)


Moderation on the Fediverse

Right now, when people install federated server instances of any kind that are open for others to join, they take on the job of instance admin. When membership grows, they attract additional moderators to help with maintenance and with keeping the community healthy.

I haven’t been an admin or mod myself, but AFAIK the moderation work is mostly manual, based on the specific administrative UI features offered by a particular app. Metrics are collected about instance operation, and federated messages come in from members (e.g. Flag and Block). There’s a limited set of moderation measures that can be taken (see e.g. Mastodon’s Moderation docs). The toughest action that can be taken is to blocklist an entire domain (here’s the list for mastodon.social, the largest fedi instance).

The burden of moderating

I think (but please correct me) that in general there are two important areas for improvement from the moderators’ perspective:

  • Moderation is very time-consuming.
  • Moderation is a somewhat thankless, underappreciated job.

It is time-consuming to monitor what happens on your server, to act on moderation requests in a timely manner, to answer questions, and to stay informed about other instances that may have to be blocked.

It is thankless / underappreciated because your instance members take the work for granted, and because you are often the bad guy when acting against someone who misbehaved. Moderation decisions are often seen as unfair and fiercely argued.

For these reasons instances are closed down, or are under-moderated so that toxic behavior can fester.

(There’s much more to this, but I’ll leave it here for now)

Federating Moderation

From the Mastodon docs:

Moderation in Mastodon is always applied locally, i.e. as seen from the particular server. An admin or moderator on one server cannot affect a user on another server, they can only affect the local copy on their own server.

This is a good, logical model. After all, you only control your own instance(s). But what if the moderation tasks that are bound to the instance got help from ActivityPub federation itself? Copying from this post:

The whole instance discovery / mapping of the Fediverse network can be federated (a sketch of such an announcement follows the list). E.g.:

  • A new server is detected
  • Instance updates internal server list
  • Instance federates (Announce) the new server
  • Other instances update their server list
  • Domain blocklisting / allowlisting actions are announced (with reason)
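To make this concrete, here is a minimal sketch, assuming TypeScript and a plain ActivityStreams-style payload, of what such a server announcement could look like. The `InstanceAnnouncement` shape and its property names are my own assumptions for illustration, not an existing ActivityPub vocabulary:

```typescript
// Hypothetical sketch: an instance announces a newly discovered server to
// its peers. The shape below is an assumption, not a defined AP extension.
interface InstanceAnnouncement {
  "@context": string;
  type: "Announce";
  actor: string;       // the announcing instance's actor
  object: {
    type: "Service";
    id: string;        // base URL of the newly discovered server
    firstSeen: string; // ISO 8601 timestamp
  };
  to: string[];        // peer instances that maintain a server list
}

const announcement: InstanceAnnouncement = {
  "@context": "https://www.w3.org/ns/activitystreams",
  type: "Announce",
  actor: "https://instance-a.example/actor",
  object: {
    type: "Service",
    id: "https://newly-found.example",
    firstSeen: new Date().toISOString(),
  },
  to: ["https://instance-b.example/actor", "https://instance-c.example/actor"],
};

console.log(JSON.stringify(announcement, null, 2));
```

A blocklisting / allowlisting announcement could reuse the same envelope, with a reason attached.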

Then, in addition to that, Moderation Incidents can be collected as metrics and federated as soon as they occur (a sketch of such an incident record follows the list):

  • User mutes / blocks, instance blocks (without PII, as it is the metric counts that are relevant)
  • Flags (federated after they are approved by admins, without PII)
  • Incidents may include more details (reason for blocking, topic e.g. ‘misinformation’)
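As an illustration, a federated incident report could be a small aggregate record like the sketch below (TypeScript; all field names are assumptions). Only counts and an optional topic travel between instances, no PII:

```typescript
// Hypothetical sketch of a federated moderation-incident metric.
// Only aggregate counts are shared; no user identifiers (no PII).
type IncidentKind = "mute" | "block" | "instance-block" | "flag";

interface ModerationIncident {
  reportingInstance: string; // instance that observed the incidents
  targetInstance: string;    // instance the incidents are about
  kind: IncidentKind;
  count: number;             // aggregate count only
  topic?: string;            // optional reason, e.g. "misinformation"
  reportedAt: string;        // ISO 8601 timestamp
}

const incident: ModerationIncident = {
  reportingInstance: "https://instance-a.example",
  targetInstance: "https://suspect.example",
  kind: "block",
  count: 42,
  topic: "spam",
  reportedAt: new Date().toISOString(),
};

console.log(JSON.stringify(incident, null, 2));
```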

So a new instance pops up, and all across the fediverse people start blocking its users. There’s probably something wrong with that instance which may warrant blocklisting. The instance admin goes to the server list, sees a large incident count for a particular server, clicks the entry, and gets a more detailed report on the nature of those incidents. They then decide whether or not to block the domain for their own instance. A sketch of the aggregation step behind that server list follows.
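This is a minimal sketch of that aggregation, assuming incident records like the ones above have already been received; the threshold and function name are made up, and the actual blocking decision stays with the admin:

```typescript
// Hypothetical sketch: total up incident counts per remote server so an
// admin dashboard can highlight servers that may deserve a closer look.
interface IncidentRecord {
  targetInstance: string;
  count: number;
}

function serversNeedingReview(
  incidents: IncidentRecord[],
  threshold: number,
): Map<string, number> {
  const totals = new Map<string, number>();
  for (const incident of incidents) {
    totals.set(
      incident.targetInstance,
      (totals.get(incident.targetInstance) ?? 0) + incident.count,
    );
  }
  // Keep only servers whose aggregate count exceeds the review threshold.
  return new Map(
    Array.from(totals).filter(([, total]) => total > threshold),
  );
}

// Example: two reports about the same server push it over the threshold.
console.log(
  serversNeedingReview(
    [
      { targetInstance: "https://suspect.example", count: 30 },
      { targetInstance: "https://suspect.example", count: 25 },
      { targetInstance: "https://fine.example", count: 3 },
    ],
    50,
  ),
);
```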

Delegated moderation

With Federated Moderation in place it may also be possible to delegate moderation tasks to admins of other instances who are authorized to do so, or even to have ‘roaming moderators’ who are not affiliated with any one instance.

I have described this idea already, but from the perspective of Discourse forums having native federation capabilities. See Discourse: Delegating Community Management. Why would you want to delegate moderation? (A sketch of a possible delegation grant follows the list.)

  • Temporarily, while looking for new mods and admins.
  • When an instance is under attack by trolls and the like, to ask for extra help.
  • When there is a large influx of new users.
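As a sketch of how such a delegation could be expressed (TypeScript; the `ModerationDelegation` type, the scopes, and the field names are assumptions, not an existing ActivityPub extension), a time-bounded grant might look like this:

```typescript
// Hypothetical sketch of a time-bounded delegation grant issued by an
// instance admin to a moderator from another instance (or a roaming mod).
interface ModerationDelegation {
  type: "ModerationDelegation";
  issuer: string;     // admin actor granting the delegation
  delegate: string;   // moderator actor receiving it
  scope: ("flags" | "reports" | "silence" | "suspend")[];
  validFrom: string;  // ISO 8601
  validUntil: string; // the delegation expires automatically
  reason?: string;    // e.g. "troll attack", "influx of new users"
}

const grant: ModerationDelegation = {
  type: "ModerationDelegation",
  issuer: "https://instance-a.example/users/admin",
  delegate: "https://instance-b.example/users/trusted-mod",
  scope: ["flags", "reports"],
  validFrom: "2021-04-01T00:00:00Z",
  validUntil: "2021-05-01T00:00:00Z",
  reason: "temporary help while recruiting new mods",
};

console.log(JSON.stringify(grant, null, 2));
```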

Moderation-as-a-Service

(Copied and extended from this post)

But this extension of the Moderation model goes further… we can have Moderation-as-a-Service. Experienced moderators and admins gain reputation and trust. They can offer their services and be rewarded for the work they do (e.g. via donations, or otherwise). They may state the timeslots in which they are available, so I could invoke their services to get 24/7 monitoring for my instance.

The reputation model of available moderators might even be federated, so I can see the history of their work, the satisfaction level / reviews by others, the amount of time spent / number of incidents handled, etc. A sketch of such a record follows.
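This is a sketch of what such a federated reputation record could contain (TypeScript; the field names and the review scale are assumptions for illustration only):

```typescript
// Hypothetical sketch of a federated moderator profile, so an admin looking
// for Moderation-as-a-Service can compare candidates.
interface ModeratorProfile {
  actor: string;              // the moderator's fediverse actor
  instancesServed: string[];  // instances they have moderated before
  incidentsHandled: number;
  hoursLogged: number;
  averageReview: number;      // e.g. 1..5, aggregated from past clients
  availability: { days: string; from: string; to: string }[];
}

const candidate: ModeratorProfile = {
  actor: "https://instance-c.example/users/experienced-mod",
  instancesServed: ["https://instance-a.example", "https://instance-b.example"],
  incidentsHandled: 310,
  hoursLogged: 120,
  averageReview: 4.6,
  availability: [{ days: "Mon-Fri", from: "18:00", to: "22:00" }],
};

console.log(JSON.stringify(candidate, null, 2));
```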

All of this could be an intrinsic part of the fabric of the Fediverse, extending across different application types.

There would be much more visibility for the under-appreciated work of moderators, and as the model matures more features could be added, e.g. support for Moderation Policies. Just as they have their own Code of Conduct, different instances will want different governance models (think democratic voting mechanisms, or Sortition; see also What would a fediverse “governance” body look like?).


Note: I highly recommend also reading the toot thread for this topic, where many people responded with great insights: https://mastodon.social/web/statuses/106059921223198405

I like this thread. I was toying around with a similar idea recently, with inspiration from Scuttlebutt[1]. Essentially, decouple single-person hosting from moderation and discovery, so people can delegate their moderation (and discovery) to someone else, or multiple someone elses.

Think about this UX design:

  1. The client is the server. P2P, essentially.
  2. When you first sign up, ask the user what topics they like. Recommend people to follow, the usual.
  3. Then – and this is important – ask the user what topics they don’t like. Then give them a list of moderators to delegate to (a sketch of the resulting preferences follows).
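As a rough sketch of the data the client could end up with after those onboarding steps (TypeScript; all names are illustrative assumptions):

```typescript
// Hypothetical sketch of client-side moderation preferences collected during
// onboarding: liked topics, disliked topics, and the delegated moderators
// whose block/mute decisions the client reuses.
interface ModerationPreferences {
  likedTopics: string[];
  dislikedTopics: string[];
  delegatedModerators: string[]; // actors whose moderation lists are applied
}

const onboardingResult: ModerationPreferences = {
  likedTopics: ["fediverse", "gardening"],
  dislikedTopics: ["crypto-spam"],
  delegatedModerators: ["https://example.social/users/topic-curator"],
};

console.log(JSON.stringify(onboardingResult, null, 2));
```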

But I am wary about delegating moderation. As it can scale infinitely, I think delegated moderation may result in a lot of consolidation, especially with MaaS, and undermine the sovereignty of people’s own experiences. Even though technically they can always choose someone else, the huge change involved, in blocklists for example, may discourage users from doing so.

I like the idea of federating (suggested) moderation. It makes it clear, UX-wise, that the person (or admin) is in charge of their own experience.

Why? Well…

Consensus vs Reuse vs FromScratch

  • Consensus in a community: Whatever anyone in the community does, it automatically affects everyone else in the community. For example, instance moderation or choosing what to eat for dinner.

“You cannot wrangle consensus from 500 people.” - https://runyourown.social/
(My note: I suspect the limit is 50 to 100 people, a number also taken from the website.)

  • Reuse in a community: Whatever anyone in the community does, it can be reused by anyone else in the community, but it doesn’t automatically affect everyone else. For instance, forking a codebase and making a small change, without intending to merge it back.

  • FromScratch: Well, there’s no community here, is there? But it’s still important that you’re able to do this.

Note: The definition of community here can mean the whole world.

[1] In Scuttlebutt, you can either loudly Block someone (which is intended to cause discussion), or quietly Ignore someone.


Have you seen #fediblock as a hashtag to share block announcements among administrators?
(e.g. from the instance I’m on: #fediblock - Layer8 in Space)


Would something like the Block Party app offer valuable lessons to learn from?

Discovered via
