Community Requests for Account Suspensions (3 of 3)

[This document is published in three parts: Part 1 can be referenced in future conversations, Part 2 deals with a very specific set of account concerns, and Part 3 addresses some long-form questions posted to the moderators on Tŵt.]

Part 3 of 3: Response to community questions

Several members of toot.wales, as well as users from other Mastodon instances, posed questions and comments about our response to their concerns over one of the accounts, which some felt deserved stronger moderation than it appeared to be receiving. These questions are preserved and answered below.

I would argue that if both moderators and members of the Tŵt.Cymru community don't appreciate the account, that's reason in itself to get rid of it, even beyond any breaching of community guidelines or lack of content warnings on distressing content. It seems the account is not seen favourably by the community.

Social media communities, like all Internet communities, are subject to the 1% Rule, also known as the 90-9-1 rule. This rule reflects the observation that in any given community roughly 1% of members are vocal, active contributors, 9% respond or participate occasionally, and 90% consume passively. The comment above was in response to one moderator, not all moderators, and two toot.wales members, not all toot.wales members. The views of three people do not necessarily reflect those of the broader community, which underscores why moderation should never be overly reactive to a vocal minority.

Relegating moderation discussion to the report facility, while encouraging auditability, is the exact opposite of transparency: it encourages moderators to form a different culture to the users on the site, entrenches separation between moderators and regular users, and is not auditable for people outside of the moderation team.

Staff conversations are not for public consumption. Publicly displaying moderator logs, decision-making, discussion and so on opens the door to harassment of both the account being considered for moderation and the volunteer staff themselves. It can also easily lead to derailed threads discussing these actions. I do not believe the guideline encourages the formation of a different culture; that is not an inevitable outcome. Either there is trust between the community and the staff, or there isn’t. If there isn’t, there are larger issues at work than knowing who said what about which account. The transparency lies in the actions of the moderators and the instance itself, not in their private discussions.

A fedi node is part of a community with other fedi nodes. It's very important to see how your neighbours are handling an issue in order to know if you should continue federating with them. Making moderator discussions private means that other fedi nodes, and potential users, will have to go off the whisper network that will ultimately be formed in parallel with your moderation discussion areas.

Tŵt’s neighbours can read our Community Guidelines, observe our adherence to the Mastodon Server Covenant, and explore our timeline. Our social contract is first with the users of our service, second with other Mastodon instances, and third with the broader fediverse that can consume and interact with our content.

It means that people can state things about your moderation behaviour and you have no immediate retaliation.

There will never be a need for “retaliation”. All observers are free to state their opinions and act on them accordingly.

And it encourages users to be rash in favour of not interacting or being on an instance that might encourage toxic or otherwise bad behaviour,

The nature of the service is such that some instances will not federate with certain other instances. This is a baked-in concept, and I have no doubt that some Fediverse participants do not agree 100% with our instance; the same is true of our instance regarding others. Our block list is public and is based on our Community Guidelines. See https://toot.wales/about/more#unavailable-content

because ultimately there is no way to know what resolution was sought.

There is generally no need for the entire community to know the resolution of a particular report; 99% of moderator actions are of little interest to 99% of the Fediverse. There is a network of instance administrators, there is a Discourse forum, there is a Discord community, and there are public pull requests and comments on GitHub. It is not at all hard to interact with the staff of most instances, and the platform is still evolving; there is currently quite a bit of work being done on the Suspend functionality, for example (see https://github.com/tootsuite/mastodon/pull/14726).

In the rare cases where the wider community wants to know about a particular problem, it will naturally surface and will be addressed, or not, by the instance. Clearly, in this case, we feel there is a need to explain our thinking; this will not always be true. And I cannot be clear enough about this point: everyone is free to defederate from anyone else. Tŵt has defederated from numerous Fediverse participants, and I would be a hypocrite if I did not agree that anyone else is free to defederate from us if we do not meet their needs or in any other manner fail to live up to that instance or person’s understanding of the social contract.

In all cases, we are not stifling discussion about moderator actions. Healthy communities allow for appropriate discussion and appeal of moderator actions. And, when appropriate, as in this document, we will seek to clarify those actions in response to community concerns.

Did you take a break for four hours? Or did you refuse to ban them from the instance? Did you give them a second chance that is conditional on their response or their behaviour? Or something else? How can we know any of those things?

You can ask. When and where appropriate, we will answer. The broader question of whether Mastodon should display these audit records in public is one for the Mastodon developer community, and I urge all who are interested to contribute their requirements and suggestions. Open source development requires participation from all users of the software if it is to reflect their needs.

What is a good time to wait for you to respond? What's a good waiting time for choosing to defederate with you? Or for leaving your instance?

We offer no service level agreement on toot.wales. The best waiting time would be exactly as long as anyone deems appropriate. We are a young, small community with a volunteer staff who all have other jobs and commitments. The nature of the Indie Web is that it is self-hosted. We have tried to create some structure around this instance to provide stability and confidence, but no guarantees are made beyond the Mastodon Server Covenant and the language found in our Terms of Service.

How do you protect yourselves against people who will inevitably see this discussion, and form word of mouth that tŵt.cymru is not a safe space for its members or for people who interact with it?

If that assertion proves true in a broad enough portion of the Fediverse, it would signal that Tŵt is not a viable exercise in its current form, and we would likely close our doors.

How can people inside and outside of the community (read: federated with) view the record of actions that have been taken in relation to accounts, and the reasons why they were taken?

(And)

How can people inside and outside of the community figure out the difference between moderators having a day off, versus moderators choosing not to act against an account (for whatever reason is deemed necessary, say an account doesn't meet the criteria, or whatever)?

They can ask. When and where appropriate, we will answer.

From the user's perspective, is there a tangible difference between the moderators sitting on their hands, versus the moderators deciding that an account does not meet the guidelines?

All communities must form a mutual trust with their host, moderators, conveners and so on. Either this trust exists, or it doesn’t. That trust is built and perpetuated by clear, consistent actions based on fair, agreed principles. If a given user does not have trust or faith in a given moderator, host, or instance, they should probably find a different moderator, host, or instance.

From the user's perspective, is there a way to view past moderator actions, with the reasons why those actions were taken? One of the reasons for this might be searching for information on someone who behaved inappropriately and harassed people, has moved to another instance, and is trying to continue it there.

Not that I am aware of. There is a community of instance administrators where these discussions take place, there are well-informed community users who raise issues and post user-submitted reports, and there is the #fediblock hashtag. These are just a few of the ways we all manage to scrape together a federated community using open standards, and as the community grows, no doubt the underlying software will grow, too.

There are over 3,000 Mastodon servers online. Over 3,000 codes of conduct for everyone to figure out. 3,000 instance administrators communing, sharing and learning from each other. I have no doubt it is far from perfect. But at heart, Mastodon and the broader Fediverse are user-centric, user-driven and community-guided. Individual instances may rise and fall in relevance, but continued participation and feedback will help form the community links required for any of this to work.

In summary:

  1. The two accounts in question were not suspended; one of them was silenced
  2. All administrative and moderator activities are free to be discussed, appealed, maligned or endorsed. Questions about these actions should be posed on the Tŵt platform to the Moderator staff. Staff are listed on https://toot.wales/about/more
  3. All moderator discussions about specific accounts are private. Moderators who cannot demonstrate a consistent, impartial approach will have their moderator privileges revoked
  4. Moderator actions will likely not occur in response to DMs or public timeline discussions. Community members seeking action from moderators should file a report
  5. We are exploring ways to convene the community in a regular, representative manner to review the Community Guidelines

All of us involved in running and maintaining Tŵt are grateful for everyone’s participation in helping craft a bilingual, safe, privacy-focussed social media experience for Wales and the Welsh, at home and abroad.

Previous: Part 2 of 3 Regarding specific account suspension requests