Programme

The summer school starts on Sunday, June 25, 2023 with a welcome reception (including a pasta buffet) between 19:00 and 20:30. On Tuesday evening there is a BBQ followed by a 'rump session', where students can present their (unfinished) research in an informal setting. On Thursday evening there is a social event in downtown Nijmegen. The summer school ends on Friday after lunch.

The summer school comprises nine lectures of two hours each (with a 15-minute break in between) and time to work on a case study (in groups of 4-6 students). See the instructions here. The following cases will be studied:

Case study results will be presented on Thursday.

To work on the case studies, please find yourself a comfortable spot somewhere.

Breakfast and lunch are available in the hotel restaurant, and are self-service. Dinner is served in the hotel restaurant as well.

Sunday, June 25

19:00 - 20:30  Welcome reception

Monday, June 26

08:00  Breakfast
08:45  Opening, Practical Information
09:00  Round of Introductions
11:15  Coffee/Tea Break
11:30  Instruction and Soft Goals
12:45  Lunch
13:45  Lecture 1: Bart Preneel: Hiding in Plain Sight: Location Privacy for the IoT
16:00  Coffee/Tea Break
16:15  Lecture 2: Aphra Kerr: Finding Common Ground in the Design of IoT Applications in Everyday life
18:30  Free Time
18:45  Dinner
20:30  Case study group assignments

Tuesday, June 27

08:00  Breakfast
09:00  Lecture 3: Michael Veale: Confidentiality Washing With PETs: A Little Guide for Big Platforms
11:15  Coffee/Tea Break
11:30  Work on case studies
12:45  Lunch
13:45  Work on case studies
16:00  Coffee/Tea Break
16:15  Touch base with your own discipline across the case study groups (lecture 4 cancelled)
18:30  Free Time
18:45  Dinner/BBQ
21:00  Rump session

Wednesday, June 28

08:00  Breakfast
09:00  Lecture 5: Marc Langheinrich: Privacy in the IoT
11:15  Coffee/Tea Break
11:30  Work on case studies
12:45  Lunch
13:45  Work on case studies
16:00  Coffee/Tea Break
16:15  Lecture 6: Jiahong Chen: Designing the responsible future of smart homes
18:30  Free Time
18:45  Dinner
20:30  Walk in the woods
21:30  Evening programme: privacy karaoke?

Thursday, June 29

08:00  Breakfast
09:00  Lecture 7: Gianclaudio Malgieri: Measuring (vulnerability and) risks to fundamental rights: quantifying the unquantifiable
11:15  Coffee/Tea Break
11:30  Work on case studies
12:45  Lunch
13:45  Lecture 8: James Stewart: Negotiating anonymity in an augmented reality university campus – apathy and IoT
16:00  Coffee/Tea Break
16:15  Case study presentations
18:45  Convene in lobby for social event
19:30  Social event: dinner in downtown Nijmegen

Friday, June 30

08:00  Breakfast
09:00  Lecture 9: Ana-Maria Cretu: Methods to evaluate the privacy of data processing technologies
11:15  Coffee/Tea Break
11:30  Evaluation and handout of certificates
12:30  Lunch
13:30  End of summer school

Social event

On Thursday, June 29, we go out for dinner at restaurant "De Hemel" ("Heaven") in downtown Nijmegen.

We convene in the lobby of the hotel at 18:45 to travel to the restaurant together by bus. Bus tickets will be provided.

Abstracts

Bart Preneel: Hiding in Plain Sight: Location Privacy for the IoT

Currently about 20 billion IoT devices have been deployed, and this number will likely double in the next five years. These devices bring enormous potential in terms of new services, but at the same time they create an infrastructure in which every person and object is continuously geolocated. In this lecture we look at a number of cases for which privacy-friendly protocols have been developed that offer valuable (distributed) services. Indeed, it is possible to reap the benefits of a widely deployed infrastructure without revealing highly sensitive data on the location of users. We consider as examples insurance pricing, road pricing, proximity detection (contact tracing) and object tracking.
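
As a flavour of what such privacy-friendly protocols can look like, here is a minimal sketch of rotating ephemeral identifiers for proximity detection, loosely inspired by decentralised contact-tracing designs; the 15-minute slots and 16-byte identifiers are illustrative assumptions, not the specific protocols discussed in the lecture.

```python
# Minimal sketch of rotating ephemeral identifiers for proximity detection,
# loosely inspired by decentralised contact-tracing designs; the slot length
# and identifier size below are illustrative assumptions.
import hashlib
import hmac
import secrets


def daily_key() -> bytes:
    """Each device draws a fresh random key per day; raw locations are never shared."""
    return secrets.token_bytes(32)


def ephemeral_ids(day_key: bytes, slots_per_day: int = 96) -> list[bytes]:
    """Derive short-lived broadcast identifiers (one per 15-minute slot).

    Without the day key, observers cannot link identifiers across slots.
    """
    return [
        hmac.new(day_key, slot.to_bytes(2, "big"), hashlib.sha256).digest()[:16]
        for slot in range(slots_per_day)
    ]


# A phone that later receives a diagnosed user's day key can recompute these
# identifiers and compare them, locally, against the identifiers it overheard.
alice_key = daily_key()
alice_ids = ephemeral_ids(alice_key)
print(len(alice_ids), alice_ids[0].hex())
```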

Aphra Kerr: Finding Common Ground in the Design of IoT Applications in Everyday life

We live in a generalised culture of surveillance, from our homes to our cities, but the conditions of our engagement in this culture vary widely (Lyons, 2018). Surveys of the public in Europe consistently show that privacy, safety and transparency are significant issues in relation to the generalised acceptance of AI-driven digital technologies (Kerr et al., 2020), and the same three issues were included in the seven key ethical issues identified by the High-Level Expert Group on AI (HLEG-AI) (Paladino, 2021). However, implementing these public expectations and normative principles has proved challenging, both for the designers of services and for their users. This is particularly the case for digital services and applications, and for certain groups of users, including youth and children. The Internet of Things raises a number of ethical and social challenges from a design and user perspective. Finding common ground in relation to shared values, shared goals and transparent forms of participation will be crucial to their acceptance (Kerr and Kelleher, 2020). In this lecture we will explore examples from smart cities, the media and the internet of toys, and discuss possible interdisciplinary methods and tools for creating common ground and addressing some of the ethical and social challenges raised.

Michael Veale: Confidentiality Washing With PETs: A Little Guide for Big Platforms

A combination of regulatory, software, hardware, and societal factors has led to a sharp growth in international interest in privacy-enhancing technologies (PETs). This interest is found just as strongly in governments, regulators, and platforms, but are they all talking about the same thing? In this talk, I will try to break down some of the ways in which PETs can be used and abused, drawing attention to the close connection between platform business strategy and the new opportunities for profiling and value extraction offered by PETs and confidential computing. These technologies are particularly interesting in light of the growing ubiquity of connected sensors in our environment, and control over who can ask questions of those sensors is likely to be one of the big challenges of our time. Their use highlights the importance of theoretical distinctions within and around the concept of privacy, distinctions that create tensions between different disciplinary perspectives. We will take a tour through some of the legal rights that may apply, and consider some of the regulatory and governance gaps that exist. We will also question the sanctity of the protection of encryption, as encryption morphs from a protection for the confidentiality of interpersonal communications into a much more open-ended technology than it has ever been before.

Didem Özkul: Locational privacy: Key issues, concerns, and approaches

Location data and locational privacy play a significant role in pervasive, ubiquitous communication and media systems. While some communication systems rely on the location of the user to deliver relevant media content, empower users by providing them with opportunities to participate and co-create maps, or help users solve privacy issues through location-based verification, the collection, use, and analysis of location data remain problematic. This is not only because of the uniqueness of location data and the difficulties in anonymising it, but also because of the extent to which location data is integrated into algorithmic decision-making and governance practices, especially for controlling physical and communicative mobilities. Users' perspectives, including their understanding, concerns, and management of locational privacy, play a significant role in informing legal and regulatory frameworks. This lecture will discuss how we can understand locational privacy and what kinds of approaches we can employ for researching location data and locational privacy in highly complex digital environments. It will also focus on how we can work with users as research partners to tackle the issues around locational privacy, with a particular focus on mobile communication technologies.

Marc Langheinrich: Privacy in the IoT

The vision of the Internet of Things has long been a dystopian one for privacy advocates. Already, today's Web poses seemingly insurmountable problems when it comes to supporting our privacy, with many "victories" being rather pyrrhic in nature (cookie notices, anybody?). How will a world featuring sensing and actuating capabilities that extend "the Web" literally into our bathrooms and bedrooms affect the task of safeguarding our privacy? In my talk I will briefly outline the origins of the IoT and its core technical attributes, before discussing the privacy challenges inherent in this vision.

Jiahong Chen: Designing the responsible future of smart homes

Smartified domestic spaces are no longer people's private castles. Connected devices have created new channels for interaction across the walls of one's home. In many cases this changes the power dynamics both among co-habitants and between them and external actors. When personal data is involved, questions arise as to who should be responsible for ensuring that such data uses are appropriate. Recent research has suggested that the design approach (in legal, architectural and interactive terms) can hold great promise. This lecture will explore how the idea of design in these various fields may contribute to the inquiry into how the future smart home can be made more responsible in its use of personal data.

Gianclaudio Malgieri: Measuring (vulnerability and) risks to fundamental rights: quantifying the unquantifiable

The GDPR introduced the idea of "risks to fundamental rights and freedoms". Despite the lively debate on the interpretation of that concept, no consensus has been reached about what measuring "risks" to "fundamental rights" means. Some support has now come from the EU Digital Strategy, in particular from the DSA ("adverse impact on the exercise of fundamental rights") and the proposed AI Act ("adverse effect on fundamental rights"). Quantifying risks means quantifying both their likelihood and their severity, and it is the quantification of severity that is most problematic. Quantification is also key to determining the concept of "vulnerable data subjects", considering that vulnerable subjects are defined as "data subjects at higher risk to their fundamental rights and freedoms". This lecture will explore possible ways to quantify the intensities of effects, harms and risks to fundamental rights, navigating through possible options and proxies.
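
For contrast, here is a minimal sketch of the conventional DPIA-style scoring (likelihood times severity on ordinal scales) that such quantification debates usually start from; the scales and the example are illustrative assumptions, not a method endorsed in the lecture.

```python
# Conventional DPIA-style risk scoring (likelihood x severity on ordinal
# scales). The scales and the example below are illustrative assumptions,
# shown only as the baseline approach whose limits the lecture examines.
LIKELIHOOD = {"remote": 1, "possible": 2, "likely": 3, "almost certain": 4}
SEVERITY = {"negligible": 1, "limited": 2, "significant": 3, "maximum": 4}


def risk_score(likelihood: str, severity: str) -> int:
    """Ordinal risk score: higher means the risk needs stronger mitigation."""
    return LIKELIHOOD[likelihood] * SEVERITY[severity]


# Hypothetical example: profiling of vulnerable data subjects judged "likely"
# with a "significant" impact on their rights and freedoms.
print(risk_score("likely", "significant"))  # -> 9
```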

James Stewart: Negotiating anonymity in an augmented reality university campus – apathy and IoT

Many abstract concepts are introduced in attempts to understand and govern privacy in the IoT world. In this session the students will be walked through a case study of the introduction of a university surveillance system, initially using Wi-Fi tracking to help students find free desks in the Library, but later repurposed for COVID safety. The case enables us to explore the socio-technical processes by which 'privacy' is negotiated and designed into a system, where expectations about how and why data might circulate and be given value are formed and discussed, within the context of deciding what sort of place and community a Library and a University are, i.e. the contested construction of the context of data flows. This in turn occurs within the wider context of surveillance, environmental sustainability, public health, data-rich services, law, social relationships, reputation and self-monitoring.
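
To make the kind of system at stake concrete, here is a minimal sketch of how a Wi-Fi counting service might pseudonymise overheard device identifiers with a daily rotating salt while still estimating occupancy; this is an assumed, simplified design, not a description of the actual campus system in the case study.

```python
# Illustrative sketch of a Wi-Fi occupancy counter that hashes overheard
# device identifiers with a daily rotating salt. This is an assumed design
# for discussion, not a description of the actual campus system.
import hashlib
import secrets


class OccupancyCounter:
    def __init__(self) -> None:
        # A fresh salt per day means hashes cannot be linked across days.
        self.daily_salt = secrets.token_bytes(16)
        self.seen_today: set[str] = set()

    def observe(self, mac_address: str) -> None:
        # Note: hashing MAC addresses is weak pseudonymisation (the input
        # space is small), which is part of why such designs are contested.
        digest = hashlib.sha256(self.daily_salt + mac_address.encode()).hexdigest()
        self.seen_today.add(digest)

    def occupancy(self) -> int:
        """Approximate headcount: distinct hashed devices seen today."""
        return len(self.seen_today)


counter = OccupancyCounter()
for mac in ["aa:bb:cc:dd:ee:01", "aa:bb:cc:dd:ee:02", "aa:bb:cc:dd:ee:01"]:
    counter.observe(mac)
print(counter.occupancy())  # -> 2 distinct devices, so roughly 2 people
```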

Ana-Maria Cretu: Methods to evaluate the privacy of data processing technologies

Our online and offline activities leave behind digital traces that are monitored and collected on a large scale by businesses and organisations to comply with the law, operate services, and power new applications. Data processing technologies have become ubiquitous in the digital age, yet may contain unknown privacy and security vulnerabilities. In this lecture, I describe how we can evaluate the privacy of data processing technologies by measuring the amount of sensitive information that untrusted parties can learn about individuals from the released data. I illustrate this approach on a range of technologies, including techniques used to anonymise location and interaction datasets, data query systems, and machine learning models.
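
As one hedged illustration of such a measurement, here is a minimal sketch that estimates the "unicity" of a toy pseudonymised location dataset: the fraction of users uniquely pinned down by a handful of known (place, time) points. The dataset and parameters are invented for illustration and are not the evaluation methods presented in the lecture.

```python
# Toy "unicity" measurement for a pseudonymised location dataset: how often
# do a few known (place, time) points single out one user? The dataset and
# parameters are invented for illustration.
import random

random.seed(0)

# Toy dataset: user id -> set of (location cell, hour) points.
traces = {
    user: {(random.randrange(50), random.randrange(24)) for _ in range(30)}
    for user in range(200)
}


def unicity(traces: dict, k: int = 4, trials: int = 1000) -> float:
    """Fraction of sampled users whose k random points match no one else."""
    users = list(traces)
    unique = 0
    for _ in range(trials):
        target = random.choice(users)
        known_points = random.sample(sorted(traces[target]), k)
        matches = [u for u in users if set(known_points) <= traces[u]]
        unique += matches == [target]
    return unique / trials


print(f"Users re-identified by 4 points: {unicity(traces):.0%}")
```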