Many internet services are designed to collect personal data and to exploit established and novel marketing techniques to nudge users into surrendering ever more of their undivided attention. While users may expect that, in return for their data and attention, they will receive content tailored to their interests, what they get is content selected and moderated according to the services’ business interests, irrespective of user enjoyment or societal welfare. Indeed, many internet services enforce moral and societal frameworks that their target audiences may neither be subject to nor agree with. Instead of serving users’ needs and treating them as ends, ad-driven services objectify users as means for profit, reducing their role to that of consumers to be manipulated into consuming more of, and specific, content chosen by international corporations. Thus, to build respectful technologies free of structural exploitation, we must go beyond considerations of data privacy and examine the ways in which technology fails to meet users’ expectations of what they will receive in return for their personal data and engagement. Specific issues affecting certain groups in society show that detrimental and discriminatory effects are pervasive, and indicate that the underlying problems require novel approaches to regulation and community-driven platform governance.
Examples of these effects include:
- Children: Manipulative online services for children do not cater to their best interests, but may instead threaten their development and freedom of thought.
- Sex workers: Digital services frequently deplatform and censor discussions of sex and sex work, preventing sex workers doing legal work from being visible to broader society, and from accessing supportive communities, harm-reduction information, and digital financial services.
In this panel we will look at digital infrastructures reflecting the needs of these two groups, children and sex workers. Our analysis is driven by the understanding that a sole focus on privacy and data protection may not be the appropriate way to regulate digital platforms and to guarantee a safe environment for users. We will discuss different personal, legal, and technological aspects of personal safety, internet governance, and regulatory ideas beyond the General Data Protection Regulation and the Digital Services Act, working towards new community-driven infrastructures that cater to intersectional justice. Specifically, we want to explore the boundary where the limits of regulation and community-driven privacy tools clash with platform governance.
- Moderator: Jan Tobias Muehlberg, Research Manager at imec-DistriNet, KU Leuven, BE
- Elissa Redmiles, Research Faculty, Max Planck Institute for Software Systems
- Tommaso Crepax, IT Researcher, Scuola Superiore Sant'Anna
- Patricia Garcia, Assistant Professor, School of Information, University of Michigan, USA
- Laïs Djone, Board Member, Utsopi, BE
Check out the full programme here.
Registrations are open until 24 January, 2022 here.