Programme

The summer school starts on Sunday with a welcome reception from 19:00 to 20:30. On Tuesday evening there is a social event in downtown Nijmegen, and on Wednesday there will be a walk in the woods followed by a rump session. The summer school ends on Friday after lunch.

The summer school comprises nine lectures of two hours each (with a 15-minute break in between) and time to work on a case study (in groups of 4-6 students). Case study results will be presented on Thursday.

Sunday, September 1

19:00–20:30  Welcome reception

Monday, September 2

08:00  Breakfast
08:45  Opening, Practical Information
09:00  Round of Introductions
11:15  Coffee/Tea Break
11:30  Instruction and Soft Goals
12:45  Lunch
13:45  Lecture 1: Colin Gray: Exploring Dark Patterns and Conceptions of Designer Responsibility
16:00  Coffee/Tea Break
16:15  Lecture 2: Michael Dieter: Exit Strategies: Dark Patterns, Interface Critique and the Struggle for Separation
18:30  Free Time
18:45  Dinner
20:30  DETOUR Act and the study of dark patterns at scale

Tuesday, September 3

08:00  Breakfast
09:00  Lecture 3: Alan Mislove: Targeted advertising and ad delivery: Privacy threats and opportunities
11:15  Coffee/Tea Break
11:30  Work on case studies
12:45  Lunch
13:45  Work on case studies
16:00  Coffee/Tea Break
16:15  Lecture 4: Gloria González Fuster: Dark Transparency Patterns
18:30  Free Time
18:45  Convene in lobby for social event
19:30  Social event: dinner in restaurant 'De Hemel', downtown Nijmegen

Wednesday, September 4

08:00  Breakfast
09:00  Lecture 5: Nora Draper: Confronting Digital Resignation
11:15  Coffee/Tea Break
11:30  Work on case studies
12:45  Lunch
13:45  Work on case studies
16:00  Coffee/Tea Break
16:15  Lecture 6: Nicolo Zingales: From Soft Nudging to Manipulation: Why Should We Care?
18:30  Free Time
18:45  Dinner
20:30  Walk in the woods
21:30  Evening programme: privacy karaoke?

Thursday, September 5

08:00  Breakfast
09:00  Lecture 7: Moniek Buijzen, Paul Graßl and Hanna Schraffenberger: Shedding Light on Dark Patterns: Ways to Empower Users in Making Deliberate Informed Consent Decisions
11:15  Coffee/Tea Break
11:30  Work on case studies
12:45  Lunch
13:45  Lecture 8: Narseo Vallina Rodriguez: Have I given consent to this? Analysing the transparency of mobile products
16:00  Coffee/Tea Break
16:15  Case study presentations
18:45  Dinner: BBQ in hotel garden
21:00  Rump session

Friday, September 6

08:00  Breakfast
09:00  Lecture 9: Mario Guglielmetti: Dark patterns, legal aspects: for a possible theory of the 'illecito multi-offensivo' (single 'multiple-offence' act)
11:15  Coffee/Tea Break
11:30  Evaluation and handout of certificates
12:30  Lunch
13:30  End of summer school

Social event

On Tuesday, September 3, we go out for dinner at restaurant 'De Hemel' (Heaven) in downtown Nijmegen.

Address: Franseplaats 1, 6511VS, Nijmegen.

We convene in the lobby of the hotel at 18:45 to go by bus together. Bus tickets will be provided.

Abstracts

Exploring Dark Patterns and Conceptions of Designer Responsibility

Colin Gray

The profession of User Experience (UX) design has rapidly expanded in the past decade, impacting the design of user interfaces, the larger strategic goals of organizations, and ultimately, the relationships of humans and society to technology. While knowledge of user needs and human psychology is generally framed as a means of generating empathy or reducing the divide between humans and technology, this knowledge also has the potential to be used for nefarious purposes. In 2010, scholar and UX practitioner Harry Brignull coined the term “dark patterns” to describe this dark side of UX practice, which I have engaged with over the past three years. In this lecture, I will describe the results of several studies that address practitioner engagement with ethics, focusing particularly on the concept of “dark patterns.” This work spans the generation of a corpus of exemplars, the sharing of examples by practitioners on social media, design students’ engagement with dark patterns in their work, and the mediation of ethical concerns in design practice. I use these studies to build a case for ethical engagement in the education and practice of socio-technical practitioners, pointing towards the need for scholars and educators to address both near-term issues such as manipulation, and longer-term issues such as technology addiction.

Recommended Readings

  • Gray, C. M., Kou, Y., Battles, B., Hoggatt, J., & Toombs, A. L. (2018, April). The Dark (Patterns) Side of UX Design. In CHI’18: Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (Paper No. 534). New York, NY: ACM Press. https://doi.org/10.1145/3173574.3174108
  • Gray, C. M., & Chivukula, S. S. (2019). Ethical Mediation in UX Practice. In CHI’19: Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems. New York, NY: ACM Press. https://doi.org/10.1145/3290605.3300408
  • Shilton, K. (2013). Values Levers: Building Ethics into Design. Science, Technology & Human Values, 38(3), 374–397. https://doi.org/10.1177/0162243912436985

Exit Strategies: Dark Patterns, Interface Critique and the Struggle for Separation

Dr. Michael Dieter

The concept of dark patterns has been developed to identify and catalogue manipulative practices embedded in user experience design. When viewed from the broader perspective of design techniques under conditions of surveillance capitalism (Zuboff, 2018) and platformization, however, the notion opens up a far more ambiguous terrain, to the extent that it posits the figure of an 'influenceable user' (Marres, 2018): an ideal type that dovetails with the victimhood discourse of social media addiction and is notably at odds with the sovereign, autonomous decision-making actor posited in legalistic documentation such as terms and conditions for platform services and apps (Gehl, 2014). Indeed, while efforts have emerged to mediate this tension through regulatory frameworks like the ePrivacy Directive and the GDPR, the contradictions manifest within competing conceptions of the contemporary user remain unresolved, and suggest a multifaceted and evolving matrix of agency that shifts between empowerment, manipulation and control (Stalder, 2018). By way of exploring how these tensions manifest in contemporary computational infrastructures, my presentation discusses a number of projects I have participated in as experiments with interface critique. These include speculative approaches to dashboards, critical walkthroughs of banking apps, the appropriation and sonification of performance optimization tools, and the creation of post-digital disconnection cards. Inspired by recent work in the field of media arts practice and theory (Andersen and Pold, 2018), these are collective critical research projects aimed at wresting knowledge from systems designed to nudge, while perpetuating forms of 'nonknowledge' regarding their operations (Fuller and Goffey, 2012).
Following an overview of these projects, I will conclude with some more theoretical reflections on how performing interface critique today involves a struggle to establish distance from the prefigurations of experience design. In this way, one key challenge would seem to involve recovering modes of collective intent based on radically existential grounds (Neyrat 2017), especially in order to insist on a separation from interfaces in support of critical know-how.

Recommended Readings

  • Singleton, Benedict. '(Notes Toward) Speculative Design,' Shifter Magazine (2015), http://shifter-magazine.com/wp-content/uploads/2015/09/Singleton-Notes-Towards-Speculative-Design.pdf
  • Andersen, Christian Ulrik and Søren Bro Pold. 'Interface Criticism: Why a Theory of the Interface?', in The Metainterface: The Art of Platforms, Cities, and Clouds, Cambridge, MA: MIT Press, 2018, pp. 15-38.
  • Neyrat, Frédéric. 'Elements for an Ecology of Separation: Beyond Ecological Constructivism'. In General Ecology: The New Ecological Paradigm, edited by Erich Hörl and James Burton, Oxford: Bloomsbury, 2017, pp. 101-128.

Introduction to the Discipline

  • Wirth, Sabine. 'Between Interactivity, Control, and "Everydayness" - Towards a Theory of User Interfaces', in Florian Hadler and Joachim Haupt (eds) Interface Critique, Berlin: Kulturverlag Kadmos, 2015, pp. 17-38.

Targeted advertising and ad delivery: Privacy threats and opportunities

Alan Mislove

Advertising now funds most popular web sites and internet services: companies including Facebook, Twitter, and Google all provide most of their services for free, in exchange for collecting data from their users. One of the primary explanations for the success of these advertising platforms is that they have leveraged this data to provide the ability for advertisers to target ads to platform users in a myriad of ways. For example, advertisers can now request that their ads be shown to complex combinations of users based on behaviors, demographics, interests, data broker-derived attributes, and even personally identifiable information (PII).

In this talk, I provide an overview of the state of the art in targeted advertising, take a critical look at how these targeted advertising services can be misused, and demonstrate how targeted advertising offers the opportunity to actually increase the transparency of advertising systems. Additionally, I examine the ad delivery implementation that platforms use to choose which advertiser gets to show their ad to a user; we demonstrate that platforms automatically estimate the relevance of ads to users, which can lead to significant bias and potential discrimination in which users see ads (even if an advertiser targets their ad in a non-discriminatory manner).

Recommended Readings

Overview Article

https://mislove.org/publications/Discrimination-FAT.pdf

Dark Transparency Patterns

Gloria González Fuster

This lecture will look into how transparency obligations imposed on data controllers are (more or less deliberately) twisted in order not to illuminate data subjects but to confuse them. It will question how it is possible to ostensibly comply with the General Data Protection Regulation (GDPR)'s information requirements while actually providing individuals with information that is not only insufficient but also relatively inaccurate and ultimately misleading, before considering how and where one can search for better (clearer, stronger) guidance on how to communicate with humans, beyond the boundaries of EU data protection law but more in line with its objectives.

Preparation

  • Art. 29 Working Party, Guidelines on transparency under Regulation 2016/679, Adopted on 29 November 2017 as last Revised and Adopted on 11 April 2018 (endorsed by the European Data Protection Board), https://ec.europa.eu/newsroom/article29/document.cfm?action=display&doc_id=51025
  • Participants are invited to carry out the following preparatory exercise: Please close your eyes. Think of your favourite brand. Open your eyes, and look for a website related to your favourite brand. Find their privacy policy. Read it, paying particular attention to any information you might find on the rights of individuals in relation to data about them.
  • Judgment of the Court (Grand Chamber), 13 May 2014, Google Spain SL and Google Inc. v Agencia Española de Protección de Datos (AEPD) and Mario Costeja González, Case C‑131/12, ECLI:EU:C:2014:317, https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:62012CJ0131&from=EN

Confronting Digital Resignation

Nora Draper

Numerous news reports have declared 2018 a landmark year for privacy. From scandals including Cambridge Analytica’s use of Facebook data and Marriott’s data breach to newly introduced regulation in Europe (GDPR) and California (CCPA), privacy is a hot topic. Yet, news stories are also replete with the feeling that little can be done to effectively defend against digital privacy and security threats. This talk will explore the sensation of digital resignation: the desire to control our personal data coupled with a feeling of being unable to do so. I will briefly examine the origins of this term and related concepts before considering the role dark patterns play in cultivating this response. I will then turn to some proposed solutions for addressing concerns regarding mounting privacy incursions. After discussing some of the limitations of these efforts, including data ownership initiatives and media literacy campaigns, I end with a discussion of how feminist theory allows us to imagine the possibilities collective approaches may offer for confronting digital resignation.

Recommended Readings

  • Nora A. Draper and Joseph Turow. (2019). The Corporate Cultivation of Digital Resignation. New Media & Society, 1-16, doi: 10.1177/1461444819833331.
  • Valerie Steeves. (2009). “Reclaiming the Social Value of Privacy,” in Ian Kerr, Valerie Steeves and Carole Lucock (eds). Privacy, Identity and Anonymity in a Network World: Lessons from the Identity Trail. New York: Oxford University Press, pp. 191–208.
  • Lina Dencik. (2018). Surveillance Realism and the Politics of Imagination: Is There No Alternative? Krisis, 1. Available at: https://krisis.eu/surveillance-realism-and-the-politics-of-imagination-is-there-no-alternative/

Overview Article

David Lyon. (2014). Surveillance, Snowden, and Big Data: Capacities, consequences, critique. Big Data & Society, 1-13, doi: 10.1177/2053951714541861.

From Soft Nudging to Manipulation: Why Should We Care?

Nicolo Zingales

Whether we scroll through our feeds on social media, search for products online or rely on other algorithmic services provided by digital intermediaries, we are constantly “nudged” (i.e., steered) in different directions by optimisation strategies and constraints embedded in technological design. Designers determine our “choice architecture”, which by definition gives them the ability to influence our perception of the world. While such a privileged position can in principle generate virtuous mechanisms, it can also be exploited to favour the interests of technology providers in ways that are unintelligible to users from the limited information they receive. This creates accountability gaps and triggers harms, or risks of harm, to both individual and collective autonomy, which our legal system must be able to address from a multidisciplinary perspective.

This session will provide an introduction to the concept of “nudging”, highlighting the various ways in which this term is used and the substantial differences between its forms and manifestations, from benign “de-biasing” and persuasiveness to outright manipulation. We will examine key parameters to evaluate the acceptability of nudging practices, including transparency, firmness, intrusiveness, and the intent of the nudger. We will then review the extent to which these parameters can be useful in assessing the legality of nudging under the applicable framework in data protection, consumer and competition law by reference to a few case-studies.

Overview Article

Calo, Ryan, 'Digital Market Manipulation', 82 Geo. Wash. L. Rev. 995 (2014).


Shedding Light on Dark Patterns: Ways to Empower Users in Making Deliberate Informed Consent Decisions

Moniek Buijzen, Paul Graßl and Hanna Schraffenberger

The lecture will consist of three parts, addressing (1) theoretical background, (2) research application, and (3) design application. First, Buijzen will discuss theoretical insights regarding information processing and behavioral change that explain how technology users make decisions about sharing their personal data. The second part of the lecture focuses on the effects of dark patterns in consent statements. Inspired by examples from real-life practice, we investigated whether dark patterns in consent statements lead to more privacy-unfriendly decisions as well as to a feeling of lacking control on the user side. In contrast to theories that view privacy decisions as a purely rational process, we highlight the role of psychological biases in privacy decision making and how dark patterns exploit them. In the final part of the lecture, we address the topic of dark patterns from an interaction design perspective. A “Bright Patterns” design exercise will challenge students to prototype a user interface for online consent statements. The proposed design should not trick users into giving consent, but rather empower users to make their own deliberate decisions, thus putting the theory of boosting into practice.

Overview Article

TBA

Recommended Readings

  • A. Acquisti et al.: Nudges for Privacy and Security: Understanding and Assisting Users’ Choices Online, ACM Computing Surveys, Vol. 50, No. 3, 2017 http://dx.doi.org/10.1145/3054926.
  • Sophie C. Boerman, Sanne Kruikemeier and Frederik J. Zuiderveen Borgesius: Online Behavioral Advertising: A Literature Review and Research Agenda, Journal of Advertising, 46:3, 363-376, DOI: 10.1080/00913367.2017.1339368.
  • Gray, C. M., Kou, Y., Battles, B., Hoggatt, J., & Toombs, A. L. (2018, April). The Dark (Patterns) Side of UX Design. In CHI’18: Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (Paper No. 534). New York, NY: ACM Press. https://doi.org/10.1145/3173574.3174108

Have I given consent to this? Analysing the transparency of mobile products

Narseo Vallina Rodriguez

Mobile technologies have revolutionised the Internet. Companies offering mobile services and applications often use profitable business models based on mass scale data collection and advertising, both enabled by the capacity to collect rich personal data (e.g., location, contacts) directly from the platform.

However, this data-driven business model has become a societal problem with far-reaching repercussions because of the way that companies use and abuse the data they gather, often without having obtained informed user consent. In this talk, we will explore and discuss the origin and nature of various privacy threats faced by users of mobile platforms and applications. Using empirical evidence gathered through the Lumen Privacy Monitor app and the AppCensus auditing platform, we will present and discuss real-world cases in which applications (and third-party SDKs embedded in mobile apps) leverage side channels to circumvent the permission controls that mobile platforms implement to protect personal data, deploy dark patterns to push users into using digital technologies, exhibit privacy policy inconsistencies, and show a concerning lack of regulatory compliance, even in software directed at vulnerable populations such as minors.

We will conclude with a discussion of the challenges and limitations that currently impede efforts to protect users from privacy abuses.

Overview Article

None.


Dark patterns, legal aspects: for a possible theory of the 'illecito multi-offensivo' (single 'multiple-offence' act)

Mario Guglielmetti

Online platforms (owned and managed by so-called Big Tech) often put users into 'rabbit holes' ('compelling behavioural patterns') via 'deception by design'. This may at the same time constitute a breach of data protection principles and rules and of consumer law, and may well be accompanied by competition-law-relevant aspects (putting users into 'walled gardens', reinforcing economies of scale and scope). The 'rabbit hole' aspect, leading to a loss of informational self-determination, can be observed with regard to different and increasingly complex 'layers':
  • the users' direct interfacing with the online platform;
  • the wider targeted ads ecosystem;
  • the further integration of the ads ecosystem within the communication, financial services, and possibly also public (state surveillance) systems. Profiling (the so-called 'scoring society') is a key aspect in this context of possible loss of informational self-determination.
We may talk about diagnosis, prognosis and possible remedies, and try to address wider open questions: is there a trade-off between efficiency and fundamental rights (and dignity)? Which kind of 'efficiency' are we referring to? Is self-determination a utopian thought? Should we be 'nudged', given that people may have limited rationality in their choices, individually and collectively? And how (under which conditions, and by whom)?

Recommended Readings

Please refer to this extensive list of background literature, with links to the actual documents.
