Header image by Brandon Mowinkel

Take responsibility for social media

With many former Twitter users looking for alternatives, the decentralised Mastodon platform is on the rise. But decentralisation alone is not enough: institutions should take responsibility and host their own Mastodon server.

Ever since Elon Musk took over Twitter, there has been a steady stream of Twitter users looking for alternatives, such as Mastodon. Mastodon is part of a larger federation of social media services called the fediverse, which includes not only the Twitter-like Mastodon itself but also, for example, Instagram-like photo sharing and TikTok-like video sharing platforms. The key idea is that this social media infrastructure is decentralised: there is no central authority that oversees or manages everything. Any user on any part of the federated social media can follow any other user, across services, platforms and servers. Perhaps the easiest analogy is email: from a personal mail address like v.a.traag@cwts.leidenuniv.nl you can reach anybody else, regardless of whether they have a Gmail address, a Hotmail address or another institutional mail address.

Trust and moderation

You might wonder what the benefit of this decentralised alternative is over established social media such as Twitter, Instagram or TikTok. We believe that part of its potential benefit is that it addresses two large problems of social media: trust and moderation. Social media play an increasing role in societal debates, and we have a special interest in understanding the role of science in such debates. Social media suffer from various problems: concerns about the spread of misinformation, about large numbers of bots and “fake” accounts, and about attacks on individual users. Although social media have increasingly tried to deal with these problems, doing so remains a rather daunting task.

Addressing problems of trust and moderation requires more than just decentralised social media. At the moment, many Mastodon servers are run by volunteers, and some servers reached their limits under the strain of the millions of users recently migrating from Twitter to Mastodon. Although such volunteer activity is supported by generous donations, this is unlikely to scale to the hundreds of millions of Twitter users. Moderation is also run by volunteers, and scaling this to many millions of users is very challenging, running into exactly the same problems that Twitter and Facebook are facing. These established social media employ hundreds or even thousands of moderators, while problematic messages keep slipping through the cracks. Moreover, moderation should not be in the hands of the few: Twitter or Facebook should not unilaterally get to decide for the world what should be allowed and what should not.

Host server

We propose that institutions step up, take responsibility and host their own servers. Institutions such as universities, research centres, newspapers, publishers, broadcast companies, ministries and NGOs all have a role to play in shaping the discussion on social media, without any single institution being in control. We believe that institutions setting up their own servers brings three benefits.

First, by hosting their own servers, institutions contribute to trust and verification of users. Many institutions already have established and verified domain names, such as cwts.nl for our own institution, and this helps to establish a trusted presence on social media. Within the federated social media, accounts would be clearly associated with that domain name, for example @vtraag@social.cwts.nl, making clear that these users belong to that institution. Institutions can limit accounts to staff members only and verify their identity, thus establishing a trusted presence. Users may benefit from the institutional connection, and as some have argued, this could establish an organisation as a trustworthy brand, making Mastodon a more suitable platform for institutions than Twitter, rather than less.
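To make the domain-based identity concrete: fediverse servers resolve handles like @vtraag@social.cwts.nl through the WebFinger protocol (RFC 7033), which ties an account to a well-known URL on the institution's own domain. The sketch below only shows how a handle maps to its WebFinger discovery URL; it does not perform any network request, and the handle used is simply the example from the text.

```python
# Sketch: map a fediverse handle to its WebFinger discovery URL
# (RFC 7033), the mechanism Mastodon servers use to resolve accounts
# across domains. The institution's control of the domain is what
# anchors the account's identity.

from urllib.parse import quote

def webfinger_url(handle: str) -> str:
    """Build the WebFinger discovery URL for a handle like '@user@host'."""
    user, _, host = handle.lstrip("@").partition("@")
    resource = f"acct:{user}@{host}"
    return f"https://{host}/.well-known/webfinger?resource={quote(resource)}"

print(webfinger_url("@vtraag@social.cwts.nl"))
```

Because the `/.well-known/webfinger` endpoint lives on the institutional domain, only whoever controls that domain can answer for its accounts; this is why a verified domain name doubles as account verification.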

Second, institutions with their own servers would contribute to the moderation of social media. Institutions need to take responsibility for moderating the behaviour of the users of their own server. This means that institutions should implement a clear moderation policy for their social media presence. This is how it should be: different contexts may require different moderation policies. For example, researchers may have different obligations and responsibilities than the general public. In this way, institutions could help ensure that debates take place more respectfully. At the same time, institutions would also help to ensure that people are able to express themselves freely and safely.

Third, establishing institutional servers makes decentralised social media sustainable. As we already noted, the current system is unlikely to scale to millions of people on the basis of voluntary contributions. By stepping up and providing their own servers, institutions provide a critical part of the necessary infrastructure, not only in monetary terms but also in terms of the time invested in verification and moderation.

CWTS initiative

This is why at CWTS we have now launched our own social media server at social.cwts.nl. Only people who are affiliated with CWTS can register for an account on this server. In practice, this means that CWTS staff members can register for an account using their institutional email address. We have written a moderation policy and set up a moderation committee that will advise the management on any violations of this policy. The management will not moderate messages directly, relying instead on the moderation committee, thus providing some necessary checks and balances. We trust CWTS staff members to behave responsibly in accordance with this moderation policy. We will not actively check every message being posted, but reported messages will be followed up on.
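One simple way to restrict registration to institutional affiliates, as described above, is to check the domain of the registrant's email address. The sketch below is purely illustrative: Mastodon has its own registration controls, this is not CWTS's actual implementation, and the domain name used is an assumption for the example.

```python
# Illustrative sketch (not CWTS's actual mechanism): restrict sign-ups
# to addresses at an institutional email domain, including subdomains.

def may_register(email: str, allowed_domain: str = "cwts.nl") -> bool:
    """Accept only email addresses at allowed_domain or a subdomain of it."""
    domain = email.rsplit("@", 1)[-1].lower()
    return domain == allowed_domain or domain.endswith("." + allowed_domain)

print(may_register("v.a.traag@cwts.nl"))      # institutional address
print(may_register("someone@gmail.com"))      # external address
```

A domain check like this only establishes affiliation at registration time; keeping the account list accurate when staff leave still requires an administrative process.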

Does this initiative solve all problems of social media? Surely not. Institutional users presumably represent only a small minority of social media users, and how the general public will find its way on Mastodon is not yet clear. This initiative will not directly reduce societal polarisation, but it may help establish a trusted social media presence for researchers who can engage in societal debates. It also expands the breadth of social media platforms that researchers can use to interact with broader communities. We believe that setting up our own server is a step forward, and we hope to see other institutions take similar initiatives.

4 Comments

Magnus Palmblad

These are all good points. I think what the CWTS has started is great, and may work well for the CWTS. However, the culture at my institution is quite different, for better or worse. We already have a social media team that moderates what employees may or may not express on social media, to the point of banning the use of plural pronouns in posts related to the institution or work performed here. Presumably, "we" could be interpreted as referring to the institution as a whole. However, Mastodon does not seem to be on their radar yet.

I agree having SURF host a Mastodon instance would not solve the moderation issues. But at least it could provide a stable and up-to-date server for those institutions that for one reason or another do not run their own, with a basic verification that the account holders are at least affiliated with a Dutch university or research institute.

Vincent Traag

Thanks!

Sorry to hear that the culture at your institution is quite different and more restrictive. I think it should be clear that most social media posts do not necessarily represent an official standpoint of an institution, even if "we" is used in the post. But this can of course be interpreted differently by other people. Your case does make clear that it is not necessarily the underlying technology that limits expression on social media.

Although I would be in favour of institutional servers, I would definitely welcome alternative choices as well. This may include a more general server, such as the SURF instance you propose, or the one currently run at akademienl.social. It could also include disciplinary servers, run for instance by learned societies. If people have such choices, it limits the policing that organisations themselves can do: otherwise, people would simply join an alternative server. If this happened regularly, it might even reflect badly on those organisations.

Magnus Palmblad

On one hand, this is an interesting idea, and I have been thinking along the same lines. On the other, some organizations already police what their employees post on social media - especially when the employer is mentioned (or tagged). There may be good intentions behind such policies, but I know most of my colleagues would prefer to use an external server and make it clear that the views expressed are those of the account owner and not the employer. Personally, I follow people because I find that they have something interesting to say, not that they represent or are vetted by a certain employer. Perhaps as an alternative, we could petition SURF to create a Mastodon server open to all academics in the Netherlands, with the same authentication method they use for other services?

Vincent Traag

I like the idea of SURF hosting a Mastodon server; it could indeed provide this service similarly to how it provides other services (like SURFdrive).

However, separate from the technical infrastructure, the central problem is that of moderation. I don't think an ICT service provider should moderate such platforms. Hence, this would require some representation from, for instance, universities and research centres. Having a single central committee that decides for all of the Netherlands might not be reasonable and, depending on the level of moderation required, not sufficient. So, in the end, we might still need to go down to the level of individual universities, or perhaps even faculties or institutes. As a comparison, we also do not have a central ethics committee or research integrity committee; these are arranged separately by individual organisations, which I think is more reasonable.

I do agree that checks and balances are needed so that organisations cannot police their employees beyond the agreed-upon rules. We have tried to organise this on our own server with a separate moderation committee; see https://social.cwts.nl/about.
