Welcome to DigiConsumers, a trimonthly exploration of recent cases and decisions about digital consumers in European law & policy. Authored by Catalina Goanta, an Associate Professor of Law specializing in the field, this series goes through the latest and most important developments in the space. Enjoy!
***
In recent years, regulators around the world have found a new focal point in addressing online information asymmetries. This focal point is dark patterns. As the topic has inspired widespread interest that brings together human-computer interaction, web measurement, data protection, consumer protection, competition law, and behavioral economics – to name a few relevant disciplines – I decided to focus this consumer update on it. With the help of Cristiana Santos, who is an expert in the conceptualization and detection of dark patterns as privacy violations, we have written a short summary of the research in the field and the regulatory concerns, as well as brief critical reflections showing where more attention should be paid.
Dark patterns are everywhere. We see them in commercial practices, and now increasingly in scientific scholarship and under public policy scrutiny. Dark patterns are design practices used by online businesses that manipulate consumers into making decisions they otherwise would not have made had they been fully informed and capable of choosing alternatives (Mathur et al., 2019). Examples include spending more money or time on a website, or consenting to tracking. In essence, dark patterns modify the consumer choice architecture through a range of design attributes that shape the consumer decision space via information flows (or the lack thereof).
Information flows causing dark patterns can take many forms. First, the content of the information can be problematic. The European Data Protection Board (EDPB, 2022) defines “content-based dark patterns” by referring to the wording and context of the sentences and information components that can lead users to make unintended, unwilling, and potentially harmful decisions regarding their personal data (e.g., the use of framing, motivational language, humor, etc.). Similarly, the UK Competition and Markets Authority (CMA, 2022) refers to “choice information” practices, meaning the information provided to consumers when presenting choices, such as basic details about a product or service (e.g., price, features, dimensions, or ingredients).
Second, the presentation of information can shape dark patterns. For example, default settings can lead to the collection, use, or disclosure of consumers’ information in a way that they did not expect. Other examples include long and complex privacy policies and terms of service, and delaying access to information by placing it behind hyperlinks, pop-ups, or dropdown menus (FTC, 2022).
Lastly, the lack of information can equally be a constituent of dark patterns. According to the behavioral study of the European Commission on online manipulation (EC, 2022), the practice of “hidden information” is one of the most prevalent forms, and it has an impact on consumers’ transactional decisions. This echoes Luguri & Strahilevitz’s empirical findings that “hidden information” is substantially more effective in influencing consumer choice than the other practices (Luguri & Strahilevitz 2021).
After a wave of investigations emerging in human-computer interaction (Gray et al. 2018; Nouwens et al. 2020) and web measurement studies (Mathur et al. 2019), dark patterns have inspired nothing less than a global movement spanning scientific and policy-making communities to define, classify, measure, and detect them (e.g., Matte et al. 2020, Utz et al. 2019, Gray et al. 2018, Soe et al. 2020, Bösch et al. 2016). In turn, this widespread interest in dark patterns has also spawned new theories of consumer vulnerability. In their conceptualization of digital asymmetry, Helberger et al. indicate that while European consumer law has recognized a category of ‘vulnerable consumers’ (e.g., vulnerability based on age or credulity), on digital markets all consumers may be structurally vulnerable at all times (Helberger et al. 2021). As platforms rely exclusively on screens and information to entice consumers into transactions, they increasingly turn to sophisticated techno-social systems to manage their information flows to consumers. Design tricks and fraudulent tactics (e.g., fake countdown timers) may echo earlier generations of unscrupulous commercial practices aimed at deceiving consumers. However, for many of the practices currently captured by the umbrella of dark patterns, the harm is not so straightforward (Gunawan et al., 2021), nor is it evident that consumers make decisions as a direct result of an interface design option, or rather in spite of it (CMA, 2022).
While policymakers are pursuing the formalization of dark patterns in law and policy at full speed (e.g., Art. 23a DSA, Art. 13(6) DMA), critical inquiries into the limitations of conceptual, measurement, and detection frameworks are generally lacking. By setting too low a threshold for categorizing an information flow as a dark pattern, we risk watering down the concept itself: if everything is a dark pattern, then nothing is a dark pattern.
Contract law and consumer protection are rich in theoretical frameworks that translate information flows between consumers and businesses, sometimes establishing the boundaries of such flows: Who is supposed to actively retrieve information in a contract? Who is supposed to investigate? How do vitiating factors or unconscionability affect whether information flows are considered harmful to the more vulnerable contracting parties? (Murray, 1969; Schneider & Ben-Shahar, 2014). On these matters, different legal systems have developed doctrines around the information paradigm, leading to different boundaries applied across a wide variety of commercial sectors. In digital markets, however, the quantification of consumer choice and behavior has created a strong rift between theory and practice, best described by the two policy extremes in the United States and the European Union. Neither the under-accountable buyer-beware stance characterizing the former, nor the over-accountable mandated disclosures pursued by the latter, captures the sophisticated shape of information flows in digital markets.
Against this background, several conceptual frameworks capture the information paradigm central to the dark patterns debate, at the intersection of several tensions:
- Under- and overrepresenting information as consumer harm amounting to dark patterns;
- Mandating more or less information to balance the digital asymmetry between companies and consumers;
- Taking away or giving more consumer autonomy.
While the dark patterns phenomenon has produced ever more definitions and taxonomies describing dark patterns from an information perspective, further harmonization is necessary. What are the best-known dark patterns taxonomies, and how do they address the difficulty weaker parties face in obtaining (correct) information, including the degree of digital literacy this process requires? Such harmonization can contribute to the literature on consumer autonomy, not only with respect to transactional information but also legal information (e.g., privacy rights).

In addition, we consider it important that dark patterns are addressed in the wider discussion around the role of (mandated) disclosures in the context of the information paradigm. How do these paradigms fare in different jurisdictions around the world (e.g., US and EU legal scholarship on consumer and privacy disclosures)? Dark patterns are a function of information, and legal systems often rely on disclosure as a means of alleviating information imbalance. In this case, however, it is not at all certain that more information is better: the entire idea of using digital environments for decision-making is to optimize the use of information, yet it is not clear what information is optimal, how, and for whom.

Lastly, we think it is necessary to call for more normative questions to be answered, such as what the participation of traders and consumers in the information paradigm should look like. Can there be multiple information paradigms relevant for further regulatory intervention in this space, and if so, which ones? What is the role of factors such as consumer autonomy and reliance on digital infrastructures in these possible paradigms? More importantly, what role should behavioral science play in these normativities? While promising, our understanding of social psychology, seen in the light of non-generalizable behavioral findings and serious reproducibility issues in the field, is far from ideal. Is current behavioral science helping us, or is it complicating our expectations from regulation even further? These are some important points that we hope research (whether academic or regulatory) can tackle in the near future.
Catalina Goanta & Cristiana Santos
***
Citation: Catalina Goanta and Cristiana Santos, Dark Patterns Everything: An Update on a Regulatory Global Movement, Network Law Review, Winter 2023.
References
Bösch et al. 2016
Christoph Bösch, Benjamin Erb, Frank Kargl, Henning Kopp, and Stefan Pfattheicher, Tales from the Dark Side: Privacy Dark Strategies and Privacy Dark Patterns, In Proceedings on Privacy Enhancing Technologies 2016(4), 237–254 (2016). https://doi.org/10.1515/popets-2016-0038
CMA 2022
Competition and Markets Authority (CMA), Online Choice Architecture: How digital design can harm competition and consumers, Discussion Paper (2022). https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/1066524/Online_choice_architecture_discussion_paper.pdf
EDPB 2022
European Data Protection Board, Guidelines 3/2022 on Dark patterns in social media platform interfaces: How to recognise and avoid them (2022). https://edpb.europa.eu/our-work-tools/documents/public-consultations/2022/guidelines-32022-dark-patterns-social-media_en
European Commission 2022
European Commission, Directorate-General for Justice and Consumers, Behavioural study on unfair commercial practices in the digital environment: dark patterns and manipulative personalisation: final report, Publications Office of the European Union (2022). https://data.europa.eu/doi/10.2838/859030
FTC 2022
Federal Trade Commission, Staff Report, Bringing Dark Patterns to Light (2022). https://www.ftc.gov/system/files/ftc_gov/pdf/P214800%20Dark%20Patterns%20Report%209.14.2022%20-%20FINAL.pdf
Gray et al. 2018
Colin M Gray, Yubo Kou, Bryan Battles, Joseph Hoggatt, and Austin L Toombs, The Dark (Patterns) Side of UX Design, In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (Montreal QC, Canada) (CHI ’18). Association for Computing Machinery, 534:1–534:14 (2018). https://doi.org/10.1145/3173574.3174108
Gunawan et al. 2021
Johanna Gunawan, David Choffnes, Woodrow Hartzog, and Christo Wilson, Towards an Understanding of Dark Pattern Privacy Harms, In What Can CHI Do About Dark Patterns? (CHI 2021 Workshop) (2021). https://darkpatterns.ccs.neu.edu/pdf/gunawan-2021-chiworkshop.pdf
Helberger et al. 2021
Natali Helberger, Orla Lynskey, Hans-W. Micklitz, Peter Rott, Marijn Sax, and Joanna Strycharz, EU Consumer Protection 2.0, Structural asymmetries in digital consumer markets, study for BEUC (2021). https://www.beuc.eu/publications/beuc-x-2021-018_eu_consumer_protection.0_0.pdf
Luguri & Strahilevitz 2021
Jamie Luguri and Lior Jacob Strahilevitz, Shining a Light on Dark Patterns, Journal of Legal Analysis, Volume 13, Issue 1, Pages 43–109 (2021) https://doi.org/10.1093/jla/laaa006
Matte et al. 2020
Célestin Matte, Cristiana Santos, and Nataliia Bielova. Do Cookie Banners Respect my Choice? Measuring Legal Compliance of Banners from IAB Europe’s Transparency and Consent Framework, In IEEE Symposium on Security and Privacy (IEEE S&P 2020). 791–809 (2020). https://doi.org/10.1109/SP40000.2020.00076
Mathur et al. 2019
Arunesh Mathur, Gunes Acar, Michael J. Friedman, Eli Lucherini, Jonathan Mayer, Marshini Chetty, and Arvind Narayanan, Dark Patterns at Scale: Findings from a Crawl of 11K Shopping Websites, Proc. ACM Hum. Comput. Interact. 3, CSCW, Article 81, 32 pages (2019). https://doi.org/10.1145/3359183
Mathur et al. 2021
Arunesh Mathur, Jonathan Mayer, and Mihir Kshirsagar, What Makes a Dark Pattern… Dark? Design Attributes, Normative Considerations, and Measurement Methods, In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (CHI ’21). Association for Computing Machinery, New York, NY, USA, Article 360, 1–18 (2021). https://doi.org/10.1145/3411764.3445610
Murray 1969
John E. Murray Jr., Unconscionability: Unconscionability, 31 U. Pitt. L. Rev. 1 (1969).
Nouwens et al. 2020
Midas Nouwens, Ilaria Liccardi, Michael Veale, David Karger, and Lalana Kagal. Dark Patterns after the GDPR: Scraping Consent Pop-ups and Demonstrating their Influence, In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (Honolulu, HI, USA) (CHI ’20). Association for Computing Machinery, 1–13 (2020). https://doi.org/10.1145/3313831.3376321
Schneider & Ben-Shahar 2014
Carl Schneider & Omri Ben-Shahar, More Than You Wanted to Know: The Failure of Mandated Disclosure (2014).
Soe et al. 2020
Than H Soe, Oda E Nordberg, Frode Guribye, and Marija Slavkovik, Circumvention by design – dark patterns in cookie consent for online news outlets, In NordiCHI ’20: Proceedings of the 11th Nordic Conference on Human-Computer Interaction: Shaping Experiences, Shaping Society (2020). https://doi.org/10.1145/3419249.3420132
Utz et al. 2019
Christine Utz, Martin Degeling, Sascha Fahl, Florian Schaub, and Thorsten Holz, (Un)informed Consent: Studying GDPR Consent Notices in the Field, In Proceedings of the 2019 ACM SIGSAC Conference on Computer and Communications Security (CCS ’19), ACM, 973–990 (2019). https://doi.org/10.1145/3319535.3354212