Take responsibility for social media
With many former Twitter users looking for alternatives, the decentralised Mastodon platform is on the rise. But decentralisation alone is not enough: institutions should take responsibility and host their own Mastodon server.
Ever since Elon Musk took over Twitter, there has been a steady stream of Twitter users looking for alternatives such as Mastodon. Mastodon is part of a larger federation of social media services called the fediverse, which includes not only Twitter-like platforms such as Mastodon but also, for example, Instagram-like photo-sharing and TikTok-like video-sharing platforms. The key idea is that this social media infrastructure is decentralised: there is no central authority that oversees or manages everything. Any user on any part of the federated social media can follow any other user, across services, platforms and servers. Perhaps the easiest analogy to understand is email: from a personal mail address like firstname.lastname@example.org you can reach anybody else, regardless of whether they have a Gmail address, a Hotmail address or another institutional mail address.
Trust and moderation
You might wonder what the benefit of this decentralised alternative is over established social media such as Twitter, Instagram or TikTok. We believe that part of its potential lies in addressing two large problems of social media: trust and moderation. Social media play an increasing role in societal debates, and we have a special interest in understanding the role of science in such debates. Yet social media suffer from various problems: there are concerns about the spread of misinformation, about large numbers of bots and “fake” accounts, and about attacks on individual users. Although social media companies have increasingly tried to deal with these issues, it remains a rather daunting task.
Addressing problems of trust and moderation requires more than just decentralised social media. At the moment, many Mastodon servers are run by volunteers, and some servers reached their limits under the strain of the millions of users recently migrating from Twitter to Mastodon. Although such volunteer activity is supported by generous donations, this is unlikely to scale to the hundreds of millions of Twitter users. Moderation is also done by volunteers, and scaling it to many millions of users is very challenging, running into exactly the same problems that Twitter and Facebook are facing. These established social media employ hundreds or even thousands of moderators, and problematic messages still slip through the cracks. Moreover, moderation should not be in the hands of the few: Twitter or Facebook should not unilaterally get to decide for the world what should be allowed and what should not.
We propose that institutions step up, take responsibility and host their own servers. Institutions such as universities, research centres, newspapers, publishers, broadcasting companies, ministries and NGOs all have a role to play in shaping the discussion on social media, without any single institution being in control. We believe that institutions setting up their own servers brings three benefits.
First, by hosting their own servers, institutions contribute to the trust in and verification of users. Many institutions already have established and verified domain names, such as cwts.nl for our own institution. Within the federated social media, accounts would be visibly associated with that domain name, for example @email@example.com, clearly establishing that these users belong to that institution. Institutions can limit users to staff members only and verify their identity, thus establishing a trusted presence. Users may benefit from the institutional connection, and as some have argued, this could establish an organisation as a trustworthy brand, making Mastodon a more suitable platform for institutions than Twitter, rather than less.
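For readers curious about the mechanics: the link between a fediverse handle and a domain is not just cosmetic. Account discovery in the fediverse works through the WebFinger protocol (RFC 7033), which maps a handle directly to a lookup on the institution's own domain, so the verified domain name is what other servers actually query. A minimal sketch in Python, using a hypothetical username:

```python
def webfinger_url(handle: str) -> str:
    """Map a fediverse handle like '@user@example.org' to the
    WebFinger discovery URL (RFC 7033) that other servers query
    to locate the account, regardless of the hosting platform."""
    user, domain = handle.lstrip("@").split("@")
    return f"https://{domain}/.well-known/webfinger?resource=acct:{user}@{domain}"

# A handle on an institutional server resolves against that
# institution's own domain ('alice' is a hypothetical username):
print(webfinger_url("@alice@social.cwts.nl"))
# https://social.cwts.nl/.well-known/webfinger?resource=acct:alice@social.cwts.nl
```

Because the lookup always goes to the domain in the handle, an account claiming institutional affiliation can only exist if the institution's own server answers for it.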
Second, institutions with their own servers would contribute to the moderation of social media. Institutions need to take responsibility for moderating the behaviour of the users of their own server. This means that institutions should implement a clear moderation policy for their social media presence. This is how it should be: different contexts may require different moderation policies. For example, researchers may have different obligations and responsibilities than the general public. In this way, institutions could help ensure that debates take place more respectfully, while also helping to ensure that people are able to express themselves freely and safely.
Third, establishing institutional servers makes decentralised social media sustainable. As we already noted, the current system is unlikely to scale to millions of people on the basis of voluntary contributions alone. By stepping up and providing their own servers, institutions provide a critical part of the necessary infrastructure, not only in monetary terms but also in terms of the time invested in verification and moderation.
This is why at CWTS we have now launched our own social media server at social.cwts.nl. Only people who are affiliated with CWTS can register for an account on this server. In practice, this means that CWTS staff members can register for an account using their institutional email address. We have written a moderation policy and set up a moderation committee that will advise the management on any violations of this policy. The management will not moderate messages directly and will rely solely on the moderation committee, thus providing some necessary checks and balances. We trust CWTS staff members to behave responsibly in accordance with this moderation policy. We will not actively check every message being posted, but reported messages will be followed up on.
Does this initiative solve all problems of social media? Certainly not. Institutional users presumably represent only a small minority of social media users, and how the general public will find its way on Mastodon is not yet clear. This initiative will not directly reduce societal polarisation, but it may help establish a trusted social media presence for researchers who engage in societal debates. It also expands the breadth of social media platforms that researchers can use to interact with broader communities. We believe that setting up our own server is a step forward, and we hope to see other institutions take similar initiatives.