Because a few giant internet companies have, over the years, come to almost completely dominate the online communication and information space, policymakers, citizens and the media commonly regard these privately owned platforms as today’s main online “public space”. However, their power to set the limits and frame of online speech and debate for nearly half the world’s population is heavily criticised.
On the one hand, they censor too much; on the other, they do not protect marginalised users enough. Platforms struggle to make sense of the differing needs and experiences of the demographics using their spaces, and social bias embedded in their algorithmic tools further complicates the issue. Some voices are calling for more targeted measures to moderate and regulate speech online, ranging from increasing platforms’ responsibilities vis-à-vis user-generated content to creating new criminal offences specific to online harms. Others advocate alternative ways to handle harmful content and online content disputes that would require neither arbitrary decisions by private actors nor lengthy judicial proceedings. For example, former UN Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression David Kaye developed a model of multi-stakeholder governance for content moderation in the form of Social Media Councils.
These alternative forms of content dispute settlement are unfortunately given too little attention in the current political debate over platform regulation, even though they could become a third way. To build such alternatives, the field of restorative justice — a framework that grew out of the prison abolition movement and Indigenous practices for addressing harm within communities — has been identified as a potential source of inspiration for online challenges. Restorative justice considers the needs of all actors involved: the person who has done the harm, the person who has been harmed, and other members of the community.
This panel aims to bring this discussion to the table and explore how the principles of restorative justice can inspire the future of content moderation online.
- Moderator: Chloé Berthélémy, Policy Advisor, European Digital Rights
- Pierre-François Docquir, Head of the Media Freedom Programme, ARTICLE 19
- Amy A. Hasinoff, Associate Professor at the University of Colorado Denver (US)
- Alexandra Geese, Member of the European Parliament
- Josephine Ballon, Head of Legal, HateAid gGmbH
Check out the full programme here.
Registrations are open until 24 January 2021 here.