#PrivacyCamp23: Event Summary

In January 2023, EDRi gathered policymakers, activists, human rights defenders, climate and social justice advocates and academics in Brussels to discuss the criticality of our digital worlds.

We welcomed 200+ participants in person and enjoyed an online audience of 600+ people engaging with the event livestream videos. If you missed the event or want a reminder of what happened in a session, find the session summaries and video recordings below.

In 2023, we came together for the 11th edition of Privacy Camp, which is jointly organised by European Digital Rights (EDRi), the Research Group on Law, Science, Technology & Society (LSTS) at Vrije Universiteit Brussel (VUB), the Institute for European Studies at Université Saint-Louis – Bruxelles (IEE at USL-B), and Privacy Salon.

What is the topic that brought us together?

Invoking critical times often sounds like a rhetorical trick. And yet, in 2022, we witnessed the beginning of both an energy and a security crisis, caused by the Russian invasion of Ukraine. Meanwhile, the world is still dealing with a major health crisis, while increasingly acknowledging the urgency of the climate and environmental crisis. Crises are situations in which the relations we are entangled in change, so understanding and influencing how these relations change, and in whose favour, becomes crucial.

In this context, we asked ourselves the following questions: How do digital technologies feed into and foster the multiple crises we inhabit? What do we need to consider when approaching the digital as a critical resource that we should nurture, so as to promote and protect rights and freedoms?

Check out the summaries below to learn about the role of digital technologies in ongoing world crises and about the current efforts to build a people-centred, democratic society!



Each year, Privacy Camp is made possible thanks to the generous support of our partners and individual donors like you. Your donations help us to maintain the conference free of charge for all digital rights advocates, activists, academics and all other participants from Europe and beyond.


Reimagining platform ecosystems

Session description | Session recording

In the context of the upcoming implementation of the Digital Services Act (DSA) and Digital Markets Act (DMA), the panellists discussed their vision of a sustainable and people-orientated platform ecosystem. Ian Brown pointed to the need for platform business innovation around interoperability. Chantal Joris expressed the vision of more open space for healthy debate, instead of the current monopoly of a few companies deciding what information users can access. Jon von Tetzchner stressed the urgency of banning the data collection and profiling practices companies currently use. And Vittorio Bertola concluded that the challenge ahead is finding a middle ground between not over-regulating the internet and preventing companies from “conquering the world, and conquering knowledge”.

The panel spent some time discussing interoperability, noting that it can advance the goals of the General Data Protection Regulation (GDPR), i.e. improving data protection by empowering users to choose more privacy-friendly services, which in turn should create market pressure on all market participants. Concerns were expressed about the current regulatory approach giving more control to companies that tend to restrict users’ speech, and about companies’ ability to circumvent interoperability obligations.

The audience raised the question of how law enforcement bodies relate to interoperability. The speakers highlighted that a limited number of platforms would also be beneficial for law enforcement, since it would ultimately facilitate control – whereas decentralisation and a larger number of players would make government control more difficult.

The rise of border tech, and civil society’s attempt to resist it, through the AI Act’s eyes

Session description | Session recording

The session focused on the Artificial Intelligence Act (AI Act), currently under negotiation, and how the legislation is failing to protect people on the move despite civil society’s recommendations. Simona De Heer highlighted the lack of clarity and transparency in the European Union’s and Member States’ development and use of technology in migration. This technology is used to profile, analyse and track people on the move at all times. Prof Niovi Vavoula explained that these AI systems are inherently biased, creating a power imbalance and a divide between foreigners and non-foreigners. While there is no ban on the use of such technology in migration, discriminatory practices are on the rise, including emotion recognition and behaviour analysis, predictive analytics, risk profiling and remote biometric identification.

Alyna Smith shed light on the impact such uses of AI-powered technologies have on undocumented people, as ID checks have become part of a strategy to increase deportations. An underlying element of these practices is racial profiling and criminal suspicion against migrants: evidence shows that 80% of border guards said that ethnicity was a useful indicator. Hope Barker highlighted that the issue is not the technology itself but its use for harmful ends. For example, reports show that Frontex officers have taken images and videos of people on the move without consent, and that in countries like Croatia, Greece, North Macedonia and Romania, drones have collected vast amounts of data without people knowing how the information will be used.

Workshop: Policing the crisis, policing as crisis: the problem(s) with Europol

Session description

The workshop engaged the audience with a series of questions about the nature and powers of Europol. Chris Jones pointed out that the last reform gave Europol many tasks it was already carrying out, including processing personal data. The agency is becoming a hub for receiving data from non-EU countries and may be involved in “data laundering” (using data extracted through illegal actions such as torture). Laure Baudrihaye-Gérard added that the lack of scrutiny over how Europol processes data, and the unclear legal position on whether the agency can be sanctioned for misbehaviour, put thousands of people under mass surveillance without the right to fight back.

Fanny Coudert underlined that the new reform will make it very difficult for the European Data Protection Supervisor to limit data processing by Europol, especially in the case of large databases. Saskia Bricmont revealed that there was no strong opposition in the European Parliament or the Council during the Europol reform negotiations. Civil society’s actions had an impact in pushing for amendments to ensure some scrutiny during the reform, but these were ultimately rejected.

Sabrina Sanchez spoke about the attempt to make “prostitution” a European crime in the VAW Directive. Since the Europol system is very opaque, sex workers do not know if they are in a database where organising as sex workers is criminalised. Romain Lanneau finished with a Dutch case in which the data of a leftist activist was used by Europol. The case is the tip of the iceberg given the lack of scrutiny of Europol. Romain invited the audience to request their data from Europol. See more here.

Contesting AI & data practices. Practical approaches to preserving public values in the datafied society

Session description | Session recording

The session investigated the practical implementation of data ethics from academic, business and legal perspectives. Iris Muis kicked off the conversation by presenting the tools Utrecht University developed (the Fundamental Rights and Algorithms Assessment and the Data Ethics Decision Aid) and how they were used in the government sector. Joost Gerritsen made the point that implementing data practices that preserve public values should be profitable. He stressed that it is important to acknowledge that GAFAM is not representative of the whole of Europe, as there are other companies relying on artificial intelligence.

Willy Tadema took us through the recent historical development of AI and how governments relate to using the technology. Willy noted that nowadays governments are reluctant to use AI without sound reasoning. The discussion raised the question of whether we should first ask if an algorithmic system is needed at all, and only then look at data ethics. The panel went on to discuss the issue of putting the responsibility and “burden of moral decision” on the tech team. Willy pointed out that we should have all key stakeholders in the room from the very beginning, including those affected by the algorithm.

Critical as existential: The EU’s CSA Regulation and the future of the internet

Session description | Session recording

Ella Jakubowska started the panel by introducing the debate surrounding the CSA proposal, which has been subject to criticism from a broad coalition ranging from legal experts to computer scientists. The debate was kicked off by the participants articulating their understanding of ‘critical‘ and ‘digital‘. For Patrick Breyer, they imply understanding the opportunities and limits posed by technology. For Elina Eickstädt, they reflect key principles of the CCC, i.e. that computers can and should be used to do good, while mistrust in authority and decentralised systems remain key. For Corinna Vetter, they imply that digital policy is also social policy, and that we should thus be aware of existing power structures and their effects on vulnerable groups.

Elina Eickstädt stressed that the belief among lawmakers that encrypted communication can be undermined in a secure way is an illusion. Encryption is binary: either you have it, or you don’t. Screening content before it is encrypted, via client-side scanning, would break the principle of end-to-end encryption, namely that users are in total control of who can read their messages.
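To make this point concrete, below is a minimal, purely illustrative sketch (in Python, not based on any real messenger’s implementation) of where client-side scanning sits in the message flow: the content is matched in plaintext on the device before encryption, so the core promise of end-to-end encryption – that only sender and recipient can read the message – no longer holds. All names in it (BLOCKLIST, encrypt_for_recipient, report_to_authority) are hypothetical.

```python
"""Illustrative sketch only: why client-side scanning undermines end-to-end encryption.

This is not any real messenger's code. BLOCKLIST, encrypt_for_recipient and
report_to_authority are hypothetical stand-ins used to show where the scan
happens in the message flow.
"""
from hashlib import sha256

# Hypothetical hash database of known illegal content, pushed to every client.
BLOCKLIST = {sha256(b"known illegal image bytes").hexdigest()}


def encrypt_for_recipient(plaintext: bytes, recipient_key: bytes) -> bytes:
    # Toy stand-in for real end-to-end encryption (e.g. the Signal protocol).
    # Repeating-key XOR is NOT secure; it only marks the step after which
    # only the recipient's device should be able to read the message.
    keystream = (recipient_key * (len(plaintext) // len(recipient_key) + 1))[: len(plaintext)]
    return bytes(p ^ k for p, k in zip(plaintext, keystream))


def report_to_authority(plaintext: bytes) -> None:
    # Hypothetical reporting hook. The crucial point: the plaintext leaves the
    # sender's control before any encryption has taken place.
    print("match reported:", sha256(plaintext).hexdigest())


def send_with_client_side_scanning(plaintext: bytes, recipient_key: bytes) -> bytes:
    # The scan runs BEFORE encryption, so whatever guarantees the encryption
    # step offers, a third party has already inspected the message content.
    if sha256(plaintext).hexdigest() in BLOCKLIST:
        report_to_authority(plaintext)
    return encrypt_for_recipient(plaintext, recipient_key)


if __name__ == "__main__":
    # The message is still encrypted and sent, but it was already matched
    # (and here, reported) in plaintext on the sender's device.
    send_with_client_side_scanning(b"known illegal image bytes", b"recipient-key")
```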

Corinna Vetter highlighted the social and labour policy issues of the CSA. Since no algorithm yet exists that could reliably detect CSA content, detection would have to be done by people who would get access to private communications. Furthermore, current content moderation practices already show that this work is done by workers who suffer from very poor working conditions and from the psychological impact of the content they have to review. Patrick Breyer presented how the CSA proposal would create a new and unprecedented form of mass surveillance.

Workshop: Police partout, justice nulle part / Digital police everywhere, justice nowhere

Session description

The workshop started by focusing on trends in tech surveillance and the harms to racialised communities. Dr Patrick Williams highlighted that the UK is currently facing problematic situations and crises within the police institution, and that in this context institutions try to distance themselves from the issue. Chris Jones spoke about the 2015-2016 period, when people on the move were coming to the European Union and the institutions reacted by resorting to controlling and tracking people. He underlined that the problem is not the technology but power, and pointed out that this approach materialises on an international scale in global policing strategies.

Itxaso Domínguez de Olazábal outlined the role of Israel in the development of the techno-solutionist approach, as it has built a leading surveillance industry by testing new technologies on Palestinian people. Itxaso explained that this system of extraction and testing underlines capitalist racism. Laurence Meyer explained that police and prisons are not effective because they are designed to enforce order, not to reduce crime or increase safety. What we have seen is that digital policing tools are both discriminatory and criminalising, impacting people who exist outside of hegemonic structures.

The discussion concluded with a strong call to action: to find cracks in the policing and criminal system and create our resistance there, and not to be paralysed in the face of these seemingly perfect and infallible technological structures. We can work towards non-reformist reforms, reducing the power of the police, going beyond technology and posing a political question about abolition.

In the eye of the storm: How sex workers navigate and adapt to real – and mythical – crises

Session description | Session recording

Kali Sudhra started the conversation by outlining the context that the COVID-19 pandemic created for sex workers. Sex workers were faced with less space in their community, increased police encounters and more people moving online to do their work. However, the online environment brought more risks, including financial discrimination by platforms, as sex workers have to abide by terms of service which are often discriminatory and require the disclosure of private information (e.g. PayPal). Sex workers also experience online censorship, a consequence of racist algorithms, which means many cannot advertise their services and are pushed further to the margins.

Yigit Aydinalp spoke about the role of private actors as enablers of harmful legal frameworks. In the Digital Services Act, the Greens introduced an amendment on non-consensual imagery which would mean that content hosts have to collect users’ phone numbers. This violates the data minimisation principle, especially when working with marginalised communities, and sex workers were not consulted on it. In the Child Sexual Abuse Regulation proposal, we also see an over-reliance on tech solutions in response to another crisis, resulting in more surveillance of marginalised communities.

Saving GDPR enforcement thanks to procedural harmonisation: Great, but how exactly?

Session description | Session recording

Lisette Mustert focused on the cooperation between the European Data Protection Supervisor (EDPS) and national authorities to highlight how cross-border handling of data occurs, and expressed the need for a new set of rules to clarify the existing uncertainties. To reach a consensual outcome, lead authorities need to assist each other. The cooperation process is complex and slow, and it may deprive parties of their procedural rights, which, given how the GDPR system is designed, does not lead to the protection of digital rights, administrative rights or defence rights.

Gwendal Le Grand explained that the authorities do not always agree on the interpretation of the law and on decisions; in such cases, the dispute resolution mechanism is triggered. Romain Robert gave the example of the Meta complaint EDRi member noyb submitted and the many procedural issues along the way. Maria Magierska highlighted the capacity and resource limitations of Data Protection Authorities (DPAs), which should be seen as a structural rather than an individual problem.

Asked what could solve the GDPR’s harmonisation problem, the speakers mentioned implementing the principle of good governance, implementing the law in full, having DPAs work as fast as the EDPS, and making the regulation as clear as possible.

Workshop: The climate crisis is a key digital rights issue

Session description

Jan Tobias introduced the discussion by outlining the question of the link between the climate crisis and digital infrastructure. Harriet Kingaby focused on the disinformation economy and climate disinformation, pointing out the harms that advertising tools create for society. Narmine Abou Bakari presented some empirical evidence of how tech companies negatively impact the environment. For example, in 2020 tech companies consumed 9 percent of global electricity and are still far from net zero. Another comparison showed that cryptocurrency mining consumes as much energy as the whole of Argentina.

Narmine also emphasised that we need repairable devices, urging for the advancement of people’s right to maintain their devices and choose their software, for transparent communication from companies, and for giving users more control over data and software.

The speakers also discussed whose justice we are prioritising when we speak about the climate crisis from a digital perspective, and how we can engage in these spaces.

Solidarity not solutionism: digital infrastructure for the planet

Session description | Session recording

The panel started with an overview of the historical relation between technology and the climate. Examples include the surveillance of movements and land defenders, access to information and the spread of disinformation, and the extractive nature of corporate technology practices. Paz Peña focused on geopolitical ethics and the exploitation of nature by digitalisation and globalisation, and on which communities, as well as non-humans, are paying the price for the “solutions” to the climate crisis put forward by companies.

Following that, the conversation turned to some of the false solutions offered for the climate crisis and their links to digital rights. Becky Kazansky spoke about carbon offsetting and how institutions, companies and individuals use it to compensate for their carbon footprint. Ecology is not a balance sheet, so it is important that we treat climate pledges by tech and other companies in the same way as other digital rights issues. Lili Fuhr added that the way we define the problem dictates the solution – and what we are seeing is that tech is being brought in to ‘fix’ the climate and made to ‘fit’ the problem. The discussion concluded with several critical points suggesting that, even though Big Tech contributes to the climate crisis, what we need to fight against is the hegemonic logic of technocapitalism as a ‘solution’ to the ecological crisis. A social justice problem cannot be distilled into a feel-good practice for European consumers.

The EU can do better: How to stop mass retention of all citizens’ communication data?

Session description | Session recording

Data retention is a topic that comes and goes. Plixavra Vogiatzoglou spoke about the case law on mass data retention and some of the challenges that arise from it. The first challenge is how to distinguish whether the interference is serious or not: data retention in itself, regardless of any harm or collection of sensitive information, constitutes an interference, and this conclusion is even more justified when private and sensitive information is included. The interference should be assessed as serious because the vast amount of data being collected significantly amplifies the power and information asymmetries between citizens and the government. The second challenge concerns the differentiation between public and national security, which seems to be made on the basis of immediate and foreseeable threats. The third challenge is striking a balance between allowing new technology in and avoiding the facilitation of mass surveillance.

In Belgium, there are currently three data retention laws, subjecting people to mass surveillance. In Germany, there is space to establish an alternative to the mass data retention approach. Furthermore, the panel discussed how the European Court of Justice reacts to the pressure coming from the Member States and what actions could be taken to defend fundamental digital rights.

Workshop EDPS Civil Society Summit: In spyware we trust. New tools, new problems?

Session description

Wojciech Wiewiórowski kicked off the discussion with a reflection on the way Member States behave when it comes to national security, recognising that many states rely on tools like spyware. He highlighted that one of the major issues is that the national security exemption is not harmonised, and that Data Protection Authorities have different duties to perform at national level. Hence, the oversight of the European Data Protection Supervisor is very important.

Rebecca White pointed out that civil society is burdened with the task of proving that untargeted mass surveillance is not the way to ensure security. Eliza Triantafillou spoke about the Predator spyware investigations and the challenge of persuading the public that it is harmful to surveil journalists. For example, recent news revealed that a journalist had been under surveillance by the Dutch secret services for 35 years, and now none of their protected sources wants to work with them. Bastien Le Querrec added that, beyond Pegasus, there are many other uses of spyware that people are not familiar with.

The following part of the workshop took a fishbowl structure and allowed for interventions from the audience. The main issues raised concerned national security, the efficacy of a ban on spyware, and the commercial use of spyware.


Thank you to this year’s Privacy Camp sponsors for their support! Reach out to us if you want to donate to #PrivacyCamp24.


Author
Viktoria Tomova
Communications and Media Officer
Twitter: @tomova_viktoria

#PrivacyCamp22: Event Summary

The theme of the 10th-anniversary edition of Privacy Camp was “Digital at the centre, rights at the margins” and included thirteen sessions on a variety of topics. The event was attended by 300 people. If you missed the event or want a reminder of what happened in a session, find the session summaries and video recordings here.

#PrivacyCamp22: Livestream

Parallel sessions at the event will take place in 2 rooms: Alice & Bob. This year, the Privacy Camp conference will also be live-streamed. So, in case you did not register or you struggle to connect to the room, you can follow the livestream here.

The sessions will be recorded and shared following the event. So in case you miss a session, you can watch them later.

#PrivacyCamp22: Final Schedule

The final programme is here! Find the full schedule of sessions and speakers for the 10th anniversary edition of Privacy Camp on 25 January 2022.

View/download the complete schedule in print format here (pdf).

Digital at the centre, rights at the margins?

After 10 successful editions and serving as the flagship annual event for digital rights enthusiasts in Europe, this year Privacy Camp will reflect on a decade of digital activism, coming together to discuss the best ways to advance human rights in the digital age.

Note that all times are in CET.


Registrations are free, so register now! The deadline to register is 16:00 CET, 24 January.

https://privacycamp.eu/2021/12/07/privacycamp22-registration-now-open/


#PrivacyCamp22: Draft programme schedule

Digital at the centre, rights at the margins?

Privacy Camp turns 10 in 2022 and it’s time to celebrate!

The special anniversary edition of Privacy Camp 2022 will be the occasion to reflect on a decade of digital activism, and to think together about the best ways to advance human rights in the digital age.

Check out the draft schedule now to find out what you can expect from the 10th edition of Privacy Camp. The event will take place online on 25 January 2022 from 9.00 to 17.30 CET.

The event will use an open source and privacy friendly tool called Big Blue Button. Participants will receive instructions on how to join prior to the event.

Sessions

In response to our call for panels, we received proposals tackling the subject from a variety of perspectives.

We are proud to present the sessions that our content committee and the committee’s advisors have selected:

  • Connecting algorithmic harm throughout the criminal legal cycle
  • Regulation vs. Governance: Who is marginalised, is “privacy” the right focus, and where do privacy tools clash with platform governance
  • Centring social injustice, de-centring tech: The case of the Dutch child benefits scandal
  • Regulating tech sector transgressions in the EU
  • A Feminist Internet
  • Drawing a (red) line in the sand: On bans, risks and the EU AI Act
  • How it started / how it is going: Status of Digital Rights half-way to the next EU elections
  • Stop Data Retention – now and forever!
  • The DSA, its future enforcement and the protection of fundamental rights
  • Surveillance tech as misclassification 2.0 for the gig economy?
  • Ministry of Microsoft: Public data in private hands

The final schedule will be published at the beginning of January.

More information

The 10th anniversary edition of Privacy Camp offers a forward-looking retrospective on the last decade of digital rights. This online edition aims at building on the lessons of the past and at collectively articulating strategic ways forward for the advancement of human rights in the digital society.

Emerging intersections within the realms of regulating digitalisation, as well as within other broader social justice movements, suggest that – while some issues remain timeless – the power struggles ahead might happen on new terrain(s). How can we adapt to these new terrains, while drawing on a decade’s worth of lessons? How can we organise with broader groups of people and other communities? What are the points of reflection we must focus on to address the wider impact of the digital rights fight?

Read more about Privacy Camp.

#PrivacyCamp22: Call for panels! (New deadline)

** UPDATE: The deadline is extended to 14 November 2021, 23:59 CEST.

Privacy Camp turns 10. It is time to celebrate. But Privacy Camp 2022 is also the occasion to reflect on a decade of digital activism, and to think together about the best ways to advance human rights in the digital age.

The 10th edition of Privacy Camp invites a forward-looking retrospective on the last decade of digital rights. This edition aims at building on the lessons of the past and at collectively articulating strategic ways forward for the advancement of human rights in the digital society.

Emerging intersections within the realms of regulating digitalisation, as well as within other broader social justice movements, suggest that – while some issues remain timeless – the power struggles ahead might happen on new terrain(s).

How can we adapt to these new terrains, while drawing on a decade’s worth of lessons? How can we organise with broader groups of people and other communities? What are the points of reflection we must focus on to address the wider impact of the digital rights fight?

Concretely, we want to explore ideas for (1) putting rights at the centre of digital policies, and (2) bringing marginalised perspectives to the core of digital rights discussions. In this spirit, we call for solution-oriented panel proposals around the following themes:

1. Putting rights at the centre of digital policies

Too often, rights are an afterthought of digital policies. In the past decade we have seen, again and again, decision-makers decide first and think about the impact on digital rights later. How can this be changed, so that future policy decisions get rights right from the start, notably in relation to automated decision-making, the platform economy, data protection and privacy of communications, and the surveillance infrastructure?

Notably, we invite proposals tackling questions such as those below:

  • What can we learn from national and EU debates around digital rights, that will be relevant for current and upcoming challenges?
  • At EU level, has there been an evolution in terms of better integration of fundamental rights concerns into policy-making and socio-technical design?
  • Has the changing role of EU institutions in relation to fundamental rights affected their approach to digital policy? Does it depend on the EU institution?
  • Halfway through its term, how is the European Commission standing in terms of digital rights and policies?
  • How do debates about EU digital policies intersect with the power of Big Tech and national states?
  • How can we make sure that rights remain a central priority once legal instruments have been adopted and the task becomes guaranteeing their effective enforcement (e.g. GDPR enforcement)?

2. Bringing marginalised perspectives to the core of digital rights discussions

The digital rights agenda was never neutral. It has been shaped over the years by a predominantly reactive approach to digital policy debates. Importantly, it also has its own dynamics dependent on a rather specific set of priorities. This means that some perspectives on digital rights, notably those coming from the point of view of marginalised people and communities, have been themselves marginalised. What are the voices and issues that have been left out, heard less, or simply not amplified enough?

Notably, we invite proposals tackling questions such as those below:

  • How have digital rights strategies and approaches suffered from a limited perspective in the past?
  • How can the digital rights community better centre the voices of people disproportionately affected by exploitative digitalisation, such as women, LGBTQI+ communities, racialised communities and people from the global south, people with disabilities, working-class people?
  • What are the lessons learnt from creating broader coalitions with other actors such as workers’ unions, groups advocating for women’s rights, LGBTQI+ rights, anti-racism movements, or migrants’ rights defenders?
  • What can we learn from how marginalised groups have been affected by digitalisation, and what effects have legal frameworks had to counter this disproportionate impact?
  • How can we make sure that when we put rights at the centre of digital policies the concerns of marginalised people are given the necessary space?
  • How might the digital rights field incorporate transformative justice and decolonial perspectives into its work?

Deadline for panel proposal submissions: 7 November 2021

Background

The past decade brought the increased digitalisation of all aspects of our life. This process has led to a growing production of data in digital formats, be they personal or non-personal data, data related to content or metadata, and often sensitive data because of their nature or because of how they are processed.

In this context, corporate and government entities have gained unprecedented power. Internet services and digital technologies have developed in often inadequate and insufficient regulatory frameworks. As a result, many have been excluded from the benefits of the digitalisation process.

In the realm of the internet, but also beyond it, our societies have seen the rise and normalisation of government mass surveillance and surveillance capitalism, with Big Tech grabbing power across all areas of public life, including public services. Connected to this trend, public and political debates have often been centred on securitisation arguments, with policy-making focused on counter-terrorism measures and on border and migration surveillance.

Against this tide, civil society, academia and some policymakers have worked together to curtail the harms of data exploitation and to promote regulatory frameworks that put human rights at their centre.

In the past 10 years, Privacy Camp has become a forum that facilitates discussion and debate, and that offers an occasion to coordinate and strategise better. It has foregrounded issues concerning EU data protection law, online content regulation and platforms’ power, the confidentiality of communications and the regulation of emerging technologies such as Artificial Intelligence (AI), among others.

In 2020 and 2021, the public debate has been dominated by the impact of harmful online content, the rise of biometrics mass surveillance and, once again, the fake dichotomy between rights and security. Furthermore, the COVID-19 pandemic highlighted our society’s dependence on digital technologies and on the actors that control them, as well as the role of individuals in facilitating or preventing access to data. It thus pushed legislators to focus on the need to further regulate digitalisation, and even re-ignited their aspirations to achieve a so-called ‘digital sovereignty’, of unclear contours.

The digital rights field’s composition, organisational practices and methods, however, have often left the people most affected by harmful uses of technologies outside of policy, advocacy or litigation work. This has resulted in siloed approaches to human rights in the digital age, or in overlooking the impact of digital infrastructure on marginalised groups and the planet itself.

With this edition of Privacy Camp, we want to move beyond empty calls to put ‘the (undefined) human at the centre’ and towards genuinely taking digital rights into account.

Submission guidelines:

  • Indicate a clear objective for your session, i.e. what would be a good outcome for you?
  • Include a list of a maximum of 4 speakers that could participate in your panel. Ensure you cover academia, civil society and decision-makers’ perspectives. Let us know which speaker(s) has/have already confirmed participation, at least in principle.
  • Make it as interactive as possible and encourage audience participation.
  • Support diversity of voices among panelists and strive for multiple perspectives.
  • Note that the average panel length is 50 minutes.

To submit a proposal, fill in this form by 14 November 2021, 23:59 CEST.

After the deadline, we will review your submissions and will notify you about the outcome of the selection procedure before 29 November. Please note that we might suggest merging panel proposals if they are similar or complement each other.

About Privacy Camp

Privacy Camp is jointly organised by European Digital Rights (EDRi), Research Group on Law, Science, Technology & Society (LSTS) at Vrije Universiteit Brussel (VUB), the Institute for European Studies at Université Saint-Louis – Bruxelles (IEE at USL-B), and Privacy Salon.

In 2022, Privacy Camp’s Content Committee are: Andreea Belu (EDRi), Gloria González Fuster (LSTS, VUB) and Rocco Bellanova (IEE, USL-B)

Privacy Camp 2022 will take place on 25 January 2022 online.

Participation is free and registrations will open in December 2021.

For inquiries, please contact Andreea Belu at andreea.belu(at)edri(dot)org.