The summer school starts on Sunday, July 6, 2025 with a welcome reception (including a pasta buffet) between 19:00 and 20:30. On Tuesday there is a BBQ followed by a 'rump session', in which students can present their (unfinished) research in an informal setting. On Thursday evening there is a social event in downtown Nijmegen. The summer school ends on Friday, July 11 after lunch.
The summer school comprises nine lectures of two hours each (with a 15-minute break in between) and time to work on a case study (in groups of 4-6 students).
Breakfast and lunch are self-service in the hotel restaurant; dinner is served there as well.
Sjoera Nas: EduGenAI: privacy-friendly access to multiple LLMs
Sjoera Nas will discuss a Data Protection Impact Assessment (DPIA) on SURF's new EduGenAI service. SURF is the Dutch IT cooperative for education and research; in the past, it invented Eduroam, for example, which is now available in more than 100 countries worldwide.
EduGenAI is a web application that will offer easy access to multiple LLMs, both public cloud models from providers such as OpenAI and Llama, and open-source models that can be hosted locally in SURF's AI Hub in Amsterdam. The service is designed to minimise the impact of data processing and to enhance data sovereignty. It will be launched gradually from 1 September 2025 onwards. In her talk, Sjoera will explain the concept of an umbrella DPIA as a tool to encourage suppliers to take privacy-friendly measures. Umbrella DPIAs can be performed as a kind of hostile audit: the Dutch government and SURF have published many DPIAs she wrote on Big Tech cloud services that identified remaining high risks, and the (possibility of) negative publicity frequently motivated Big Tech to make global changes. In this case, by contrast, the DPIA was written through a collaborative approach with the developers, resulting in many improvements prior to the launch.
Seda Gürses: The PET Paradox in Computational Infrastructures
Recent applications of Privacy Enhancing Technologies (PETs) reveal a paradox. PETs aim to alleviate power asymmetries, but can actually entrench the infrastructural power of the companies implementing them vis-à-vis other public and private organisations. We investigate how this contradiction manifests based on two examples. The first is the Google and Apple Exposure Notification (GAEN) framework, deployed during the first year of the COVID-19 pandemic. With GAEN, Google and Apple successfully repurposed the consumer smartphones they control into an infrastructure which, in cooperation with governments, they slotted into the heart of public health efforts. The second example is Amazon’s cloud connectivity service Sidewalk, which repurposes Amazon smart home devices (e.g., Ring and Echo) into gateways that provide connectivity to IoT manufacturers. Both examples show how powerful tech companies can instrumentalise PETs to restructure information flows towards outcomes that benefit them over governments and other businesses. They are able to do so because of their concentrated control over computational infrastructures, made up of clouds plus end devices. The empirical examples will help us understand how this infrastructural power functions and will allow us to think concretely about whether and when PETs that aim to constrain informational asymmetries can be an antidote to infrastructural power.
Seeta Peña Gangadharan: Negative policy infrastructure and digital rights: Probing limits and possibilities for pushing back against police tech in Europe
Over the last few decades, growing investments in the development and use of new technologies for tracking, targeting, and intercepting criminal activity throughout Europe have been met with human rights concerns about discrimination and bias in the policing of members of minoritized and racialized communities. Digital rights advocates, anti-racist organizers, and others routinely call out the digitalization of public safety with a range of strategies and tactics, though to varying degrees of success or influence. How do we understand this collection of interventions targeting states and private actors? Who benefits, and who ought to benefit, from this pushback? This talk addresses these questions through the lens of “negative policy infrastructure”: the space of political possibility created by the exceptions, contradictions, or negligence found in the digital rights landscape in Europe. Drawing from a range of efforts to push back against police tech in Western, Central, and Southern Europe, this talk shows how negative policy infrastructure creates both uncertainty and opportunity for public and private actors to defend digital rights. To bridge the theoretical and the pragmatic, the talk will invite students to engage with participatory tools designed to examine and assess political possibilities in the governance of police tech.
Thorsten Strufe: Navigating Privacy in the Age of Outsourcing and Observation
Privacy, especially in the evolving environment of opaque sub-contracting, outsourcing, and passive observation, is becoming an increasingly difficult goal to achieve. Informational self-determination, the foundation of privacy, implies that individuals know which information about themselves is shared when, with whom, and to what extent. With changes in applications and system architectures, and with ever more immersive hardware, this becomes increasingly difficult and unwieldy.
In my lecture I will briefly introduce common system architectures, and subsequently the basics of privacy from a technological perspective. We will touch on different disclosure risks and get a primer on privacy-enhancing technologies from 30,000 feet.
Tamar Sharon: The moral limits of digitalization: A sphere-centric approach for digital society
As we transition to a digital society, the key spheres that make up society are being transformed. In the process, we may relinquish democratic control over spheres that increasingly rely on privately owned digital infrastructure for their proper functioning, while the values, forms of expertise and ends that these spheres have traditionally embodied and sought to realize are at risk of being lost. Yet dominant approaches to addressing digitalization risks, which focus on either data protection or competitive markets, do not seek to protect spheres. In this talk I propose a novel, sphere-centric framework for the moral evaluation of digitalization, one that seeks to protect the autonomy, integrity and plurality of spheres in digital society.
Velislava (Veli) Hillman: The digital classroom: governance, power and the public interest
In the name of efficiency, inclusion and ‘21st-century skills’, ‘big tech’ and ‘ed-tech’ businesses have encroached on public education globally. But behind the branding of digital transformation lies an unsettling truth: public schooling is becoming a new frontier for surveillance capitalism and commercialisation. This lecture exposes how ed-tech and AI vendors (often unaccountable, venture-backed and profit-driven) infiltrate classrooms as the new ‘pedagogic authorities’ and influence how children learn, what they learn and who gets to own their data. We will ask: How do societies allow this? What happens to democratic accountability when children’s learning becomes a market opportunity? Through a critical excavation of ownership structures, profit motives and digital policy short-sightedness, we will reveal how the digital classroom has become a commercial and political arena, where the core purposes of education are increasingly shaped by corporate logics rather than public values. This session is participatory: students will join an interactive investigation of some of the digital (and AI) systems already in use, confront their origins and wider corporate networks, and explore approaches grounded in human rights, transparency and meaningful governance committed to public education.
Joris van Hoboken: Where there is a will, there's a way?
Public sector institutions across the world grapple with the impact of the digital transformation on their relationship with citizens. In this contribution, we will discuss two case studies from the Netherlands that shine some light on the practical legal dimensions of this problem. The first case study relates to the municipality of Amsterdam, which has in recent years set new policies with respect to the data gathered about people in the city's public spaces. The city wished to improve the effective enjoyment of privacy on the one hand, and to ensure due access to data for public purposes on the other. The second case study is the work of the University of Amsterdam to develop the concept of Digital Sovereignty from the perspective of higher education and research institutions. In this context it had to take account of its policies relating to cloud and related services, but also of the avalanche of new European legal rules relating to data and digital services. Both case studies demonstrate a clear wish to create new policies and positions in response to big-tech-driven change in our society and institutional settings, but also the significant challenges of navigating the legal dimensions when doing so in practice.
Cynthia Liem: Working as intended? Practical challenges in public accountability on potential algorithmic harm
As many well-publicized examples have made clear, data-driven decision-making may cause algorithmic harm. Knowing this, however, the open question remains how to learn from and mitigate such harm. In this session, I will consider three well-publicized scandals in the Netherlands: the childcare benefits scandal at the Tax and Customs Administration, the welfare fraud algorithm of the municipality of Rotterdam, and the checks by the Education Executive Agency of the Netherlands (DUO) on possibly illegitimate use of student finance by students living away from home. In all three cases, I was consulted as a technical auditor/advisor, either by investigative journalists or by the organizations themselves. From these experiences, I wish to share lessons on how, beyond algorithmic considerations, organizational incentives, governance and culture are of major importance in handling potential scandals. At the same time, the context of (potential) public visibility may itself put pressure on the degree to which constructive dialogues can be held. As such, I seek to invite a discussion on how such constructive dialogues can be created.