What is meant by prioritising Trust and Safety?

The Question

During the introduction of the governing board at the Matrix conference, the consensus was that something needed to be done about "Trust and Safety", and that it would be the priority.

No one really elaborated on what that meant, and people still didn't seem to know. Someone even suggested, "We should enable the Trust and Safety people to figure it out".

What it probably means

When people say "the priority needs to be trust and safety", they're thinking about different things.

The main ideas, though, seem to be:

  1. If you moderate a public Matrix room, it is a bad experience.
  2. If you moderate matrix.org itself, it is a bad experience.

Most people in the Matrix community have experienced the former, and some of them have experienced the latter.

Why does Trust and Safety need to be a priority?

But why does Trust and Safety need to be a priority? It wasn't a priority before, so why should it become one now?

Trust and Safety has not been the top priority for Matrix.

I don't know what to tell you, but if you take a look around at how this thing is built, it's clear that safety has not been the top priority in the design of this system[1].

Why has Trust and Safety not been a priority?

Vendors, Element in particular, have been responsible for the majority of Matrix development. The customers they earn the most money from do not use Matrix in the way that the online community does. They don't have random accounts joining their rooms from the open federation to talk or post spam. They run closed federations, where all the users are managed. Feature development and day-to-day maintenance therefore does not have to consider the same experience that open-federation Matrix does[2].

This has had a significant material impact on the state of Trust and Safety in the Matrix ecosystem. Notably, vendors have consistently de-prioritised Trust & Safety, providing their teams with scarce resources and scavenging what little resources they had to make ends meet[3].

This work has so far provided very little direct revenue for any Matrix vendor. At best, revenue is speculative and indirect: attracting communities to Matrix, which then attract people who advocate for Matrix in their day jobs.

This is not directly profitable work, at least not in a conventional way. Communities are not able to pay for service contracts that can sustain expensive developers, and neither is matrix.org.

What are the biggest liabilities?

The biggest liability for the foundation is the operation of the matrix.org homeserver, the largest known homeserver in the federation. matrix.org is therefore the largest source of abuse in the open federation[4].

I don't know what firefighting for the foundation is like today, but I do know what it has been like, and the biggest hits to morale came when we were unable, or too late, to stop communities being overrun with abuse.

There is another, less well-known problem though: the number of rooms joined by matrix.org users that need to be investigated and taken down. This is a problem unique to the operation of homeservers with public registration.

What should be the priority?

The priority is to talk to the people on the front line: both room moderators and the moderators of matrix.org. Their condition and experiences need to be accurately understood by the governing board. The current problems cannot be fixed by assuming what the problem is[5].

We need to work with moderators and people who are building communities, and understand what their problems are. We also need to make sure this is done with technical and structural insight, so that it's possible to identify more fundamental underlying problems, which room moderators might not be able to work out on their own.

This is constant work that requires continuous analysis of the condition of the entire ecosystem, all of its participants, and all of its influences.

But again, why should room moderators be the priority?

It does not have to be. The foundation could decide that open federation is too much work, and that the protocol and its vendors have diverged too far. The public use case for Matrix could move away from being something like Discord, Slack, and IRC, towards something used in more private contexts, like WhatsApp or Signal.

Key takeaways

  • Market conditions and vendors cannot be relied upon to provide trust and safety tooling for Matrix's open federation. This is work that requires continuous analysis and effort to stay on top of as the ecosystem and related technologies change. Vendors have conflicting interests that they must prioritise, and cannot be relied upon to maintain personnel and expertise.
  • Matrix.org's greatest liability is the homeserver itself, and the challenges that the matrix.org abuse team is facing are not public or understood.
  • Community-focussed tooling such as Draupnir depends upon feedback, but is blind to the needs of the foundation.
  • A body needs to be in place to help the governing board and developers fully understand the needs of both room moderators from across the open federation and the matrix.org abuse team. A triage system similar to Draupnir's, assessing the value of issues against their relative cost, can help with planning improvements (see the sketch after this list). Decisions can then be made with the confidence that they will be effective.
  • Future MSCs must demonstrate their impact on room moderation. Where necessary, we need to be confident that future MSCs will not exacerbate room moderation concerns, by including relevant moderation projects as required implementations.
  • The foundation needs to assess whether it is sustainable to promote Matrix for use in an open-federation context. That context is what most of us mean when we say "the Matrix community", and abandoning it would mean saying goodbye. But this is a big fight, and so far it has not worked without overworking people.
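
To illustrate the triage idea mentioned above: ranking issues by value against cost can be sketched in a few lines of TypeScript. This is purely illustrative and not Draupnir's actual system; the scoring fields and example issues are my own assumptions.

    // Purely illustrative: a minimal value-versus-cost triage score.
    interface Issue {
      title: string;
      value: number; // estimated benefit to moderators (e.g. reach x severity)
      cost: number;  // estimated effort to implement a fix
    }

    // Rank issues by value per unit of cost, so that cheap, high-impact
    // fixes surface first when planning improvements.
    function triage(issues: Issue[]): Issue[] {
      return [...issues].sort((a, b) => b.value / b.cost - a.value / a.cost);
    }

    const ranked = triage([
      { title: "Restrict links from new joiners", value: 8, cost: 3 },
      { title: "Improve report routing", value: 6, cost: 5 },
    ]);
    console.log(ranked.map((issue) => issue.title));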

Footnotes:

[1]

The reason is that the default power level provides a room member with the ability to post the same types of messages and content at the same rate as anyone else (ambient authority).
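
As a concrete illustration, this is roughly what the content of an m.room.power_levels state event looks like with the spec's usual defaults (shown as a TypeScript object; the user ID is a made-up example):

    // With users_default and events_default both 0, every member who joins
    // can immediately send any message type, at any rate: the ambient
    // authority described above.
    const powerLevels = {
      users: { "@alice:example.org": 100 }, // the room creator
      users_default: 0,   // the power level every other member gets
      events_default: 0,  // level required to send ordinary messages
      state_default: 50,  // level required to change most state events
      ban: 50,
      kick: 50,
      redact: 50,
      invite: 0,          // any member can invite more users
    };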

By default, servers are also provided with the authority to join any number of users they want, and to send any number of messages.

That's just in the design of Matrix's main feature, the room. That's without mentioning reporting, discovery, invitation flows, or whatever else is causing issues right now, or the monsters hiding inside matrix.org's massive public server.

[2]

This is actually worse: because vendors do not give the open federation the same weight, some of their proposals for the protocol can, without careful consideration from others, exacerbate the current situation. For example, distributed and pseudonymous identity will be a huge problem unless that work is coordinated with the community, including being confident that anti-abuse tooling will still work by making such tools required implementations.

There will be many other proposals, current and future, that need to be considered this way.

[4]

This is due to the ease of registration, but also because the server is hard to ban, given the number of users who originate from there. Recently on CME (the community moderation effort) this trend had improved, but unfortunately the overall number of daily bans has increased significantly again.

[5]

Draupnir doesn't have much insight from the moderators of matrix.org, only from room moderators who are connected to Draupnir or the community moderation effort. Draupnir's triage system also helps signal what the biggest problems are for us in terms of usability. Beyond the remaining structural concerns Draupnir has to respond to, the priority for us right now is simple.

The Matrix room needs to be looked at, particularly the features that new users to a community are given access to implicitly and by default (ambient authority), which they then use to disrupt the room. Specifically, this means their ability to send links, upload images, and mention users.

We will be doing this anyway once 2.0.0 is released (or sooner), enforcing ad-hoc rules with redaction events.
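
As a rough sketch of what enforcing such an ad-hoc rule with redaction events could look like, here is a minimal moderation bot using the matrix-bot-sdk library. The homeserver URL, access token, probation window, and link pattern are all assumptions for illustration; Draupnir's real implementation will differ.

    import { MatrixClient, SimpleFsStorageProvider } from "matrix-bot-sdk";

    const client = new MatrixClient(
      "https://matrix.example.org", // hypothetical homeserver
      "ACCESS_TOKEN",               // hypothetical bot access token
      new SimpleFsStorageProvider("bot.json")
    );

    // Assumption: treat anyone who joined within the last 24 hours as "new".
    const NEW_USER_WINDOW_MS = 24 * 60 * 60 * 1000;
    const joinTimes = new Map<string, number>();

    // Record when members join, so we know who is new to the room.
    client.on("room.event", (roomId: string, event: any) => {
      if (event["type"] === "m.room.member"
          && event["content"]?.["membership"] === "join") {
        joinTimes.set(event["state_key"], event["origin_server_ts"]);
      }
    });

    // Redact link-bearing messages from recent joiners (an ad-hoc rule).
    client.on("room.message", async (roomId: string, event: any) => {
      const joinedAt = joinTimes.get(event["sender"]);
      const isNew = joinedAt !== undefined
        && event["origin_server_ts"] - joinedAt < NEW_USER_WINDOW_MS;
      const body: string = event["content"]?.["body"] ?? "";
      if (isNew && /https?:\/\//i.test(body)) {
        await client.redactEvent(roomId, event["event_id"],
          "New users may not post links here");
      }
    });

    client.start().then(() => console.log("moderation sketch running"));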

I have MSCs lined up for this, and for other things too.