Privacy Dark Patterns: A Case for Regulatory Reform in Canada

September 8, 2022

by Ademola Adeyoju, winner of the 2022 Privacy and Access Law Section Student Essay Contest

Abstract

By 2023, nearly 90 percent of the Canadian population, or about 35 million people, will have access to the internet. Increased access to the internet also means that more people must constantly share their personal information (i.e., location data, IP addresses, financial and health data) with companies. To help people maintain control over how, when, and to whom they disclose their personal information – and to govern how companies collect, use, and disclose personal information – a number of federal and provincial privacy laws exist across Canada.

However, due to the lack of technical and procedural clarity in current laws and the inherent weaknesses of the knowledge-and-consent architecture upon which these laws are built, companies are finding novel ways to undermine the central objectives of the laws and prevent people from making informed decisions by subverting their autonomy through privacy dark patterns – privacy dark patterns being problematic design patterns or elements on websites and mobile applications that trick, manipulate, or coerce people into giving away personal information they would rather keep.

Although most people do not realize it, we encounter privacy dark patterns on the web every day: in fact, they have become so common and so woven into our online experience that we have become oblivious to them – which is not startling, since dark patterns are designed to work in the background, without intrusion, while they continue to manipulate or deceive. And they continue to proliferate still.

As they enable the invasion of privacy in subtle but unprecedented ways, privacy dark patterns have stimulated serious conversations in recent years in jurisdictions such as Europe and the United States, and led to the adoption of new legislative measures. Despite the move in Europe and the United States to shield the fundamental essence of privacy laws from being violated by dark patterns, nothing seems to be happening in Canada yet, even as privacy concerns among Canadians grow. In fact, as of the time of writing this paper, there does not seem to be any publicly available academic, legal, or policy document specifically addressing the issue of privacy dark patterns in relation to how they affect Canadians’ right to privacy. At the very least, therefore, this paper hopes to: (x) raise awareness of the relatively novel threats that dark patterns pose to privacy by taking an interdisciplinary approach that combines insights from behavioural economics, product design, and law; and (y) propose policy and regulatory recommendations that can help combat the proliferation and use of privacy dark patterns in Canada.

This paper is written in three parts. Part 1 introduces the concepts of patterns, anti-patterns, and dark patterns. Zeroing in on dark patterns, this part then examines: (x) the issue of what makes dark patterns dark or problematic; (y) the concept of choice architecture and why dark patterns are so effective at tricking, manipulating, or coercing people; and (z) the threats that a particular species of dark patterns – i.e., privacy dark patterns – pose to information privacy. Part 2 analyzes the current approaches to regulating privacy dark patterns in Europe and the United States, two key jurisdictions addressing privacy dark patterns in distinct, but interesting, ways. The analysis there builds the momentum for Part 3, where I consider the major issues with the current privacy regime in Canada and propose possible policy and regulatory approaches to regulating privacy dark patterns in Canada.

This is at once an explanatory, comparative, policy-oriented, and normative paper.

PART 1

I begin this part by introducing the concepts of patterns, anti-patterns, and dark patterns. This introduction is essential as it provides the broad context for the core issues discussed in the paper. Then I zero in on dark patterns and examine: (x) the issue of what makes dark patterns dark or problematic; (y) the concept of choice architecture and why dark patterns are so effective at tricking, manipulating, or coercing people; and (z) the threats that a particular species of dark patterns – i.e., privacy dark patterns – pose to information privacy. This last consideration lays the foundation for Part 2 of this paper, where I discuss how privacy dark patterns are currently being regulated in Europe and the United States.

Understanding Patterns, AntiPatterns, and Dark Patterns

Patterns are reusable solutions to recurring problems. First used in the common-problem/common-solution context by Christopher Alexander et al. in “A Pattern Language: Towns, Buildings, Construction”, patterns are a blueprint, a high-level description of a solution that can be implemented however the implementer chooses – “each pattern describes a problem which occurs over and over again… and then describes the core of the solution to that problem, in such a way that [one] can use this solution a million times over, without doing it the same way twice”.1

Borrowing Alexander’s conception of ‘patterns’, Erich Gamma and colleagues, now famously known as the Gang of Four, introduced patterns to the world of computer science in 1994 in their seminal work, “Design Patterns: Elements of Reusable Object-Oriented Software”.2 As in architecture, patterns, the authors argue, can also be used “in terms of [object-oriented designs3] and interfaces”.4 In their work, the Gang of Four structures patterns into four elements: a name, which enables thinking and talking about design in a new language; a problem, which describes when to apply the pattern; a solution, which describes a general template for solving the problem; and consequences, that is, the results and trade-offs of applying the pattern.5
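
To make these four elements more concrete, the following minimal sketch (my own illustration in TypeScript, not an excerpt from the Gang of Four’s book) annotates one widely known pattern, the Observer, with its name, problem, solution, and consequences:

// Name: Observer – a shared vocabulary for talking about the design.
// Problem: one object changes state and many others must react,
//          without the two sides being hard-wired to each other.
// Solution: the subject keeps a list of subscribers and notifies
//           them through a small, shared interface.
// Consequences: loose coupling between subject and observers,
//               at the cost of less predictable update ordering.

interface Observer {
  update(state: string): void;
}

class Subject {
  private observers: Observer[] = [];

  subscribe(observer: Observer): void {
    this.observers.push(observer);
  }

  setState(state: string): void {
    // Every registered observer reacts to the change.
    this.observers.forEach((o) => o.update(state));
  }
}

// Usage: a logger subscribes and reacts whenever the state changes.
const subject = new Subject();
subject.subscribe({ update: (s) => console.log(`state is now ${s}`) });
subject.setState("ready");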

The use and proliferation of patterns in software design and development would ultimately give rise to a different class of common design problems: antipatterns. Coined by computer programmer Andrew Koenig6 in 1995 and expounded upon by William Brown et al. in their book “AntiPatterns: Refactoring Software, Architectures, and Projects in Crisis,” an antipattern is a “commonly occurring solution to a problem that generates decidedly negative consequences”.7 In a manner of speaking, therefore, antipatterns are the opposites of patterns – thus, while patterns are a collection of good solutions, antipatterns8 are a collection of “common defective processes and implementations… ”9 usually caused by “laziness, mistakes, or school-boy errors”.10 To recapitulate, while patterns collect the Dos for good intentions, “antipatterns collect the Don’ts for good intentions”.11

But there is a third, more sinister class of patterns that “collect… [the] Dos for malicious intents.”12 This third class, now generally known as dark patterns,13 implicates both user experience (UX) and user interface (UI) design. For clarity, UX design focuses on all aspects of a user’s smooth, intuitive, and logical interaction with and understanding of a system, product, or service, while UI design is all about aesthetics (button shapes, colors, font type, etc.), responsiveness (including optimization for different screen sizes), and general efficiency.

As a class of patterns, dark patterns14 have been defined variously by academic researchers as design choices, which: are “[used] in digital interface [and] intended to elicit certain behaviors from users”;15 are deliberately deployed by designers abusing their knowledge of human behavior;16 “influence a person’s behavior against their intentions or best interests”;17 manipulate or subvert users’ intent;18 “make it difficult for users to express their actual preferences, or manipulate users into taking certain actions”;19 “[coerce, steer, or deceive] users into making unintended and potentially harmful decisions”;20 and trick users into doing things that they did not mean to do.21 Though different in their formulations, these definitions clearly paint dark patterns as design practices that are, at best, manipulative or deceptive and, at worst, coercive.

An average internet user encounters dark patterns on the web every day:22 they “manifest in several different locations [on the web and in mobile applications]… and they can rely heavily upon interface manipulation, such as changing the hierarchy of interface elements or prioritizing certain options over others using different color”.23 You have experienced dark patterns where an e-commerce platform manipulates your purchasing decision (e.g., by adding an unwanted item to your cart or by adding hidden fees) so you end up paying more than you intended; where you have come across ridiculously long and complex terms of use or dense disclaimers; where, days after using a navigation service, you realize that your location is still “on” and the service provider has been tracking every single step you have taken; where a tick, which traditionally and representationally means a “yes”, is used in a context in which it actually means a “no”; or where a cookie banner prominently presents the “accept” button but hides the “reject” option and makes it a complicated process to “disable” certain cookies (including, for instance, those that are not strictly necessary).
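
Purely for illustration, the hypothetical snippet below sketches how the cookie-banner example might be implemented in code: accepting everything is a single, prominent click, while refusing is demoted to a small “manage options” link that leads into a multi-step flow. The markup, styling, and function names are invented for this sketch and are not drawn from any real platform.

// A hypothetical cookie banner exhibiting a dark pattern:
// accepting all cookies is one bright, prominent button, while
// rejecting them is a small, low-contrast link that opens a
// multi-step "manage preferences" flow instead of simply refusing.
function renderCookieBanner(): HTMLElement {
  const banner = document.createElement("div");

  const accept = document.createElement("button");
  accept.textContent = "Accept all cookies";
  accept.style.cssText =
    "background:#1a73e8;color:#fff;font-size:18px;padding:12px 32px;";
  accept.onclick = () => setConsent({ analytics: true, advertising: true });

  const reject = document.createElement("a");
  reject.textContent = "Manage options"; // 'reject' is never offered directly
  reject.style.cssText = "color:#999;font-size:11px;margin-left:24px;";
  reject.onclick = () => openMultiStepPreferenceFlow(); // several screens deep

  banner.append(accept, reject);
  return banner;
}

// Placeholder implementations so the sketch is self-contained.
function setConsent(choices: { analytics: boolean; advertising: boolean }): void {
  console.log("consent recorded", choices);
}
function openMultiStepPreferenceFlow(): void {
  console.log("user sent into a long preferences flow");
}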

Indeed, dark patterns have become so common and so woven into our digital experience that we have become oblivious to them or simply fail to spot them: this is not surprising, since they are designed to work in the background, without intrusion, while they continue to manipulate or deceive us.24 And they continue to proliferate still. As for motivation, companies deploy dark patterns for various reasons – from the ambition to gain or retain market competitiveness, to the desire to harvest users’ attention, to the determination to exploit regulatory loopholes.

Depending on how they are implemented, dark patterns can influence specific choices, nudge users towards particular behaviors, deceive or confuse users through carefully crafted misstatements, misleading statements or omissions, and, in extreme cases, even coerce users by offering a limited set of acceptable options. Some examples have been provided above. Yet, for all that has been said about dark patterns, identifying them “in the wild” can still be challenging: i.e., despite a general understanding of the phenomenon, it can be tricky to determine with any degree of consensus what design elements in digital products or services constitute dark patterns. In other words, the question of what qualifies a pattern as “dark” cannot be answered in any simple way, and one’s answer would vary depending on one’s perspective or end goal. Indeed, some researchers have suggested that this lack of baseline certainty “is one possible reason why the law has been slow to hold app and website designers accountable for dark patterns…”25

By way of addressing this uncertainty, some useful suggestions have been offered. Luguri and Strahilevitz argue that one element common to dark patterns is that they are designed to “manipulate consumers by altering online choice architecture in ways that are designed to thwart users’ preferences for objectionable ends.”26 In their conception of dark patterns, therefore, Luguri and Strahilevitz generally insist on the designer’s intent as being relevant.27 Acknowledging that “[u]nderstanding designers’ intentions and ethical decisions is subjective and may lead to imprecision…”,28 Linda Di Geronimo et al. suggest that a design interface should be classified as a dark pattern “on every occasion in which an interface seemed to benefit the app rather than the user. For instance, if an app asks for location permissions and the UI [design] seems to prefer the ‘accept’ option, [this should be considered] a malicious design… even though the designers may have intended this feature to [simply] speed up the interaction process”.29 By going in this direction, Linda Di Geronimo et al. focus more on the functional or effectual30, as opposed to the subjective, component of dark patterns. Yet others have argued that intent and function/effect should be combined when assessing whether a design pattern is dark. Thus, in establishing a threshold, Rinehart et al. argue that “a dark pattern must [show] (1) clear nefarious intent to mislead reasonable consumers; and (2) reasonable likelihood of achieving some harm to a reasonable consumer at the benefit of the bad actor”.31

Going further than setting a definitional basis for dark patterns, Mathur et al. offer normative approaches that are quite useful in determining when dark patterns do indeed become problematic and require attention.32 Their overall conception is based on the notion that regulators, researchers, and policy makers cannot possibly concern themselves with all manifestations of dark patterns in the wild. According to the authors, therefore, a dark pattern becomes problematic and deserves attention when it diminishes individual autonomy, individual welfare, collective welfare, or regulatory objectives. Individual autonomy is impacted where design patterns influence or nudge a user into making a decision they would not have otherwise made if the options were presented differently. Individual welfare is diminished when design patterns negatively affect a user’s ability to make independent decisions, such that the platform provider is benefited at the expense of the user. Collective welfare is affected when a design pattern influences individual users’ agency in a manner that results in cognitive burden, financial loss, and invasion of privacy for the user but enables the platform provider to “extract wealth and build market power without doing so on the merits”.33 Regulatory objectives are weakened when design patterns interfere with regulatory objectives (for instance, by enabling location tracking by default or making it practically impossible to delete an account) or otherwise achieve the effect of exploiting the lack of technical and procedural clarity in implementing specific regulatory requirements, e.g., those around consent.34

While the above categorizations are not entirely objective, they are especially useful in the context of this paper, as subsequent discourse on dark patterns shall be limited to those that impact individual autonomy and undermine regulatory objectives. This is so partly because these metrics are the more easily measurable ones and partly because they are so interconnected: taking away individual autonomy automatically qualifies as undermining the objectives of the relevant privacy laws (whose main essence is to protect people’s control over their personal information by balancing privacy and commercial/business interests), and by undermining regulatory objectives, an individual’s autonomy or agency to decide how, when, and to what extent their personal information may be used is naturally affected and the protection afforded by the law is taken away.

To reiterate, in measuring whether a dark pattern is problematic in the context of this paper and for the purpose of making or implementing relevant laws or policies, the twin thresholds to be considered should be whether or not the dark pattern in question detracts from, subverts, or takes away a user’s autonomous agency and undermines the objectives of privacy regulations. As I argue in parts 2 and 3 of this paper, these individual autonomy and regulatory objective metrics should be universally applied, without much regard to the actual intention of the designers.35

Choice Architecture and Why Dark Patterns Work

To understand why dark patterns work and how they can be so effective at tricking, manipulating, or even coercing the most savvy of digital platform users, this section borrows insights from behavioural research and psychology, fields that have produced, within the past five decades, research findings that essentially prove that we humans are, by nature, irrational beings;36 that we are not “flawless assessors of scientific and probabilistic judgments… [and that we] display a startling ineptitude for comprehending causality and probability”.37 Not only are we irrational; studies have shown that our irrationality is predictable too,38 as we make systematic mistakes influenced by emotions, social norms, and several other invisible forces.

Apart from being predictably irrational, we are also “somewhat mindless, passive decision makers”.39 Thus, for instance, when confronted with situations where things could go on without our intervention, we often choose the default, whether or not it is good for us. And in instances where we have to or choose to do something, our behavior is susceptible to both external influences and internal biases.40 We are, thus, highly nudge-able, to the extent that “… even in life’s most important decisions, [our choices] are influenced in ways that would not be anticipated in a standard economic framework”.41

Dark patterns are effective because they capitalize on our irrationality, our passivity, and our biases through choice architecture. Choice architecture is the practice of designing the contexts in which decisions are made; in our current context, it is the art of shaping the environment or setting in which people make choices. Since the choices people make have been shown to change depending on the way options are presented to them, choice architecture – especially when combined with insights from behavioral research – is a powerful tool to nudge, influence, manipulate, or coerce people into making certain decisions or behaving in highly predictable ways. In the physical world, IKEA is probably a good example of a company that employs choice architecture to manipulate customers by getting them “hooked with everything from psychological tricks to the layout of the store… ”.42 In the digital world, much more manipulation can be done much more easily: there are boundless possibilities, and digital platform providers can make people do what they want simply by tweaking the substance, presentation, and range of available options on their platforms – after all, platform designs are defined by visual cues, rules, and logic that naturally determine the range of maneuverability, or the boundaries of possibility, available to any user. As the Critical Engineer Manifesto states: “[t]he Critical Engineer recognises that each work of engineering engineers its user, proportional to that user's dependency upon it”.43

Choice architecture becomes an enabler of dark patterns when, instead of catering to people’s preferences, it is used to cause users to engage in conduct or make choices that are against their best interests.44 Interestingly, even those UI/UX designers who are unaware of the power of choice architecture end up influencing people anyway, and so there is no such thing as neutral choice architecture. Consider an amusing scenario in Thaler and Sunstein’s “Nudge”, where, due to bad architecture – or antipatterns – Thaler and his students find themselves instinctively pulling on the large wooden handles of a lecture hall’s door as they exit, because door handles are designed to be pulled and flat plates are designed to be pushed. It turns out that the door actually opens outward and the only way to exit is to push the handles!45

Given that companies invest billions of dollars in artificial intelligence and randomized experiments, such as A/B testing,46 to understand the nuances of human behavior and ways to leverage that understanding to improve the choice architecture of their platforms, it is probably safe to conclude that these companies know that there is no such thing as a “choice-less” architecture. They know that we humans “do not order our preferences according to simple maximization axioms, but rather our preferences are shaped at least in part by the manner in which they are elicited,”47 and that different presentations of options will usually lead to different choices or outcomes.48 With this understanding, companies tweak their choice architecture all the time, for example, by testing which of several versions of a product implementation works best on users.49
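
As a simplified, hypothetical illustration of such experimentation, the sketch below deterministically assigns users to one of two consent-banner variants and tallies which variant produces more opt-ins; the variant names and functions are invented for this example, not taken from any company’s actual tooling.

// A minimal, hypothetical A/B test: users are split between two
// versions of a consent banner, and the platform measures which
// version produces more opt-ins to data collection.
type Variant = "A_neutral_banner" | "B_prominent_accept";

const optIns: Record<Variant, number> = {
  A_neutral_banner: 0,
  B_prominent_accept: 0,
};

function assignVariant(userId: string): Variant {
  // Deterministic split: hash the user id into one of two buckets.
  let hash = 0;
  for (const ch of userId) hash = (hash * 31 + ch.charCodeAt(0)) >>> 0;
  return hash % 2 === 0 ? "A_neutral_banner" : "B_prominent_accept";
}

function recordOptIn(userId: string): void {
  optIns[assignVariant(userId)] += 1;
}

// After enough traffic, the variant with more opt-ins "wins" and is
// rolled out to everyone – regardless of whether it won by serving
// users better or by nudging them more effectively.
function winningVariant(): Variant {
  return optIns.A_neutral_banner >= optIns.B_prominent_accept
    ? "A_neutral_banner"
    : "B_prominent_accept";
}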

Inevitably, these companies exploit the power of choice architecture and deploy dark pattern techniques because they have commercial incentives to do so.50 In other words, when faced with the choice between exploiting humans’ cognitive weaknesses through choice architecture and acting ethically, data-driven, profit-maximizing companies almost always choose the former. And why not? After all, “for all their virtues, markets often give companies a strong incentive to cater to (and profit from) human frailties, rather than to try to eradicate them or to minimize their effects”.51

On Privacy Dark Patterns and Information Privacy

Privacy is an innate human desire and a feature of existence that anthropological and historical evidence suggests has always been with us.52 Even medieval society, with its relatively “open” way of life, desired privacy.53 Apart from being the foundation of modern democracy,54 privacy is essential to human dignity, autonomy, and freedom, and is considered a prerequisite for the enjoyment of other civil liberties, the protection of reputation, and the maintenance of social boundaries.

Privacy is classified differently in different contexts and jurisdictions. In Canada, it is classified in some quarters into information, territorial, and personal privacy.55 In this paper, I am only concerned with information privacy, i.e., “the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to others,”56 which has, within the past couple of decades, become perhaps the most discussed of all privacy classes, whose status has been elevated to a historically high level on the global stage, and which has spurred the enactment of dedicated information privacy or data protection legislation across the world. This rise in interest in information privacy is not surprising considering that the ongoing digital revolution is fuelled by data.57

Since it was first (re)defined by Alan Westin in 1970 – in response to raging computer and privacy issues in the United States, which reached their peak around 196558 – information privacy has faced serious threats, mostly due to advancements in technology. In the case of dark patterns, information privacy faces new and unprecedented threats because technology enables the “ease of deployment, the speed, the scale, the precision [and] the control of variables”59 that make dark patterns so effective and so harmful. Through privacy dark patterns,60 “we increasingly see companies… manipulate people into giving up their personal data…”61 To be sure, even the biggest companies in the world use privacy dark patterns62 – often in violation of information privacy regulations – to nudge, deceive, or coerce people into giving away as much personal information as possible through “privacy intrusive default settings, misleading wording, giving users an illusion of control, hiding away privacy-friendly choices, take-it-or-leave-it choices, and choice architectures where choosing the privacy friendly option requires more effort for the users”.63

This is clearly not ideal, and given that privacy dark patterns meet both the individual autonomy and regulatory objective metrics discussed above, regulators all over the world need to pay keen attention, lest the regulatory goal of protecting people’s information privacy continue to be defeated through what is essentially a digital sleight of hand.

Put differently, privacy dark patterns are problematic and deserve attention because they damage privacy and other related fundamental values through design. They “… [create] a situation where [users] are in an increasingly vulnerable position… [and] have significant impact on the [users’] wellbeing, as the single decisions adopted by the controllers are not only affecting [users’] rights and freedoms but also their satisfaction”.64

Not only do privacy dark patterns put users in a vulnerable position, the seemingly insatiable thirst to “collect it all” also potentially has serious (unintended) consequences – for example, “collective harms [may] arise from the collection, aggregation, and use of data [which can lead to] predictions and inferences that can be made about a zip code or an area code”.65 This may, in turn, lead to discrimination,66 inequalities, and “more substantive outcomes on [marginalized, under-privileged, or minority communities’] well-being or access to opportunities”.67

Furthermore, privacy dark patterns erode people’s trust in digital platform providers and hurt commerce. They also foster a tendency to make people serve technology for the profit of companies. In the words of the former European Data Protection Supervisor:68

[Privacy dark patterns] is a big problem, which we must not underestimate. I am very worried that dark patterns become an established way of circumventing people’s privacy choices. I am also worried, because there is only a small gap between nudging and recklessly taking advantage of natural human traits. I fear that by employing dark patterns companies foster the view that humans are not autonomous individuals but rather function in technical determined ways. When individuals are not treated as persons but only as mere aggregates of data which can be processed on an industrial scale for the profit of some companies, they are clearly not fully respected in their dignity. Putting forward such an image of humanity is not only questioning our most fundamental social values but also lessens people’s trust in digital services.69

Lastly, but perhaps most importantly, privacy dark patterns also systematically weaken the very foundation of information privacy laws by undermining fair information principles – including transparency, accountability, and limiting data collection and use, but especially consent, the principle upon which many countries’ privacy regimes are built – and, consequently, violate people’s fundamental rights. As only a few legislative attempts have been made to specifically address the threats they pose to privacy, privacy dark patterns are still technically legal in most parts of the world, considering the argument that they only really offend the spirit, not the letter, of most legislation. Thus, if left unchecked through clear, explicit prohibitions, current circumstances suggest that competition and market forces will continue to drive more companies to deploy privacy dark patterns, even as dark pattern techniques become more sophisticated.70 Consequently, more people may lose control over their information, an already endangered fundamental right may erode faster, and we may be heading towards a dystopian future where surveillance capitalism becomes the way of life.

PART 2

In part 1, I explored the concept of dark patterns, the notion of choice architecture,71 and what makes dark patterns problematic particularly in the context of this paper – i.e., the subversion of user autonomy and the undermining of regulatory objectives. I also noted that privacy dark patterns (the particular species of dark patterns with which we are concerned in this paper) are a powerful tool deployed by digital service providers that causes people to give away personal information they would rather keep and generally works to weaken an already endangered right to privacy in subtle but serious ways.

In this part, I analyze the current approaches to regulating privacy dark patterns in Europe and the United States, two key jurisdictions addressing privacy dark patterns in distinct ways. The analysis here builds the momentum for part 3 of this paper, where I consider possible policy and regulatory approaches to regulating privacy dark patterns in Canada. To facilitate the discussion, I will now briefly examine notable privacy dark patterns taxonomies developed so far – these taxonomies are especially useful for privacy scholars, policy makers, and the general public because they constitute a common language that aids the understanding of, and the ability to speak about, privacy dark patterns in much the same way that Alexander’s and the Gang of Four’s conceptions of patterns make it possible, even efficient, to speak about complicated frameworks in architecture and software design respectively.

Privacy Dark Patterns Taxonomies

Since Harry Brignull first came up with rough categorizations of dark patterns on his website in 2010,72 researchers have developed taxonomies that are more systematic, in that they focus on different broad categories of dark patterns. Clearly, we are only concerned here with privacy dark patterns taxonomies.73 That said, in 2016, Bösch et al. identified six taxonomies “that take into account traditional privacy patterns, empirical evidence of malign patterns, underlying malicious strategies, and their psychological background”.74 These taxonomies include, but are not limited to, Privacy Zuckering (which involves unnecessarily complex and incomprehensible privacy settings);75 Bad Defaults (which entails implementing bad default options so that users are forced to share more information than they otherwise would);76 Address Book Leeching (which entails a service provider asking users to upload their “address books to connect with known contacts on that service”);77 and Immortal Accounts (which involves making it complicated or almost impossible to delete an account and associated data).78
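
To make the Bad Defaults idea concrete, the hypothetical sketch below contrasts a settings object whose most privacy-intrusive values are preselected with a privacy-by-default alternative; the field names are invented purely for illustration.

// Hypothetical illustration of the "Bad Defaults" pattern: the most
// privacy-intrusive values are preselected, so user inertia works in
// the provider's favour.
interface SharingSettings {
  shareLocation: boolean;
  shareContacts: boolean;
  personalizedAds: boolean;
}

// Dark-pattern defaults: everything is on unless the user digs into
// the settings screen and turns it off.
const badDefaults: SharingSettings = {
  shareLocation: true,
  shareContacts: true,
  personalizedAds: true,
};

// A privacy-by-default alternative: nothing is shared until the user
// makes an affirmative choice.
const privacyFriendlyDefaults: SharingSettings = {
  shareLocation: false,
  shareContacts: false,
  personalizedAds: false,
};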

In 2017, Lothar Fritsch79 identified three privacy dark patterns in the context of identity management, which “shows how security is used as obfuscation of data collection, how the seemingly harmless collection of additional data is advertised to end users, and how the use of anonymization technology is actively discouraged by service providers”.80 Fritsch’s taxonomies comprise: Fogging Identification with Security (which entails baiting users into divulging personal information in exchange for more security);81 Sweet Seduction (which involves asking users for more and often unnecessary information with an accompanying promise of confidentiality or secrecy of that information);82 and You Can Run But You Can’t Hide (which involves platforms that refuse access, usually on the basis of security, to users who would rather remain anonymous).83

Then in 2018, the Norwegian Consumer Council (“NCC”), an agency of the government of Norway working to improve consumer satisfaction and generally strengthen consumers’ position, released a report titled: Every Step You Take: How Deceptive Design Lets Google Track Users 24/7, which examined certain privacy dark patterns deployed in Google services and found them to be “unethical” and “in breach of European data protection legislation…”84 In this report, the NCC identified six privacy dark patterns: Hidden Default Settings (which entails setting bad defaults and hiding those settings from users);85 Misleading and Unbalanced Information (which involves trivializing how users’ data will be used and the extent of tracking going on);86 Deceptive Click Flow (which entails deceiving users into unwittingly enabling certain data collection);87 Repeated Nudging (which involves repeatedly asking users to allow certain data collection);88 Bundling of Services and Lack of Granular Choices (which entails integrating separate functionalities and affording no opportunity for users to make specific choices);89 and Permissions and Always-On Settings (which involves the almost ceaseless collection of personal information through privacy settings enabled by users, even when users are not using the relevant services).90

In the same year, the NCC released another report titled: Deceived by Design, in which it identified five other categories of dark patterns, i.e.: Default Settings (which capitalizes on users’ inertia and entails the presetting of bad privacy options that ultimately lead users into giving away their information without even knowing it);91 Ease (a related taxonomy that describes a situation where digital platform providers make it easy for users to give away their information and extremely complicated to limit information sharing);92 Framing (which employs insights from linguistics and psychology to focus on the positive aspects of one choice, while glossing over any potentially negative aspects, and which is intended to incline many users to comply with the service provider’s wishes);93 Rewards and Punishment (a privacy dark pattern technique used to nudge users into accepting the collection of information based on the threat of users losing certain functionalities if they decline or opt out of information sharing);94 and, lastly, Forced Action and Timing (which entails putting “pressure on the user to complete [certain] settings review at a time determined by the service provider, without a clear option to postpone the process”).95

And then in 2019, the French Data Protection Agency, the Commission Nationale de l’Informatique et des Libertés, developed a taxonomy of potentially deceptive design practices. The taxonomy comprises a staggering 18 distinct categories of dark patterns, which include: Obfuscating Consent (which entails “[c]reating a deliberately long and tedious process to achieve the finest settings or make them so fine and complicated that they will encourage the user to give up before reaching their initial target”);96 Trick Question (which entails misleading users to pick the wrong option through, for example, the use of double negatives);97 Impenetrable Wall (which involves blocking users from accessing a service unless they consent to: (x) creating an account; or (y) tracking by cookies);98 Repetitive Incentive (which entails inserting incentives on data sharing requests);99 and Default Sharing (which involves pre-checking information sharing options).100

It is important to remember the names of the privacy dark patterns taxonomies considered above, as they will be used throughout the rest of this paper. Also, it should be noted that these taxonomies are not exhaustive: the taxonomies continue to grow as researchers continue to survey the web and scrutinize problematic design patterns – growth that serves as evidence of increasing proliferation of dark patterns in the wild.

Regulating Privacy Dark Patterns – the Case in Europe and the United States

There are currently very few privacy laws anywhere in the world that specifically regulate privacy dark patterns.101 I say “specifically” because one can find provisions in many privacy laws that prohibit – even if only indirectly – the subversion of people’s autonomous agency and the obtaining of personal information not through real choice but through manipulation, deceit, or coercion. This implicit prohibition of privacy dark patterns characterizes the current state of privacy laws in the European Union (“EU”) (and most of Africa, where many information privacy regimes draw inspiration from the Data Protection Directive (Directive 95/46/EC) and/or the European Union General Data Protection Regulation). The problem with implicit prohibitions, of course, is that they require further explanation or interpretation; absent such explanation, they are most effective only where regulated entities opt to follow both the spirit and the letter of the law. Interestingly, one jurisdiction in particular – i.e., the United States (“US”) – has gone further and enacted legislation specifically prohibiting privacy dark patterns.

To reiterate, privacy dark patterns are currently being regulated in two ways – explicitly, through express prohibitions that forbid relevant entities from tricking people into giving away their personal information (as in the US), or implicitly, through prescriptive provisions that cumulatively serve to ensure individual autonomy, informed and meaningful choice, and privacy by design (as in the EU). As will become obvious from the comparative outlook on the EU and US approaches, Canadian privacy laws are, in their current state, outdated and outmoded, thus leaving Canadians’ privacy interests in a rather compromised position.

The Current EU Approach

In Europe, privacy dark patterns are prohibited, albeit implicitly, under the General Data Protection Regulation (“GDPR” or “Regulation”)102 through expansively crafted, technologically neutral provisions, which, as noted above, are designed to preserve individual autonomy, ensure meaningful choice, and facilitate privacy by design. Bearing in mind that the core objective of most privacy dark patterns is to undermine the essence of the notice-and-choice architecture,103 it is interesting to note that the GDPR places consent at the heart of data processing104 and stipulates key, interlocking conditions for obtaining valid consent.105

Thus, when relying on consent, entities subject to the GDPR are required by Article 7 and Recitals 32, 42, and 43 of the Regulation106 to ensure that consent is procured lawfully, i.e., “by a clear affirmative act establishing a freely given, specific, informed and unambiguous indication of the data subject’s agreement to the processing of personal data relating to him or her”.107 In their Guidelines on Consent, the European Data Protection Board (“EDPB”) – the EU body that contributes to the consistent application of the GDPR – has added to that by emphasizing that: “… consent can only be an appropriate lawful basis if a data subject is offered control and is offered a genuine choice with regard to accepting or declining the terms offered or declining them108… If not, the data subject’s control becomes illusory and consent will be an invalid basis for processing, rendering the processing activity unlawful”.109

As the following paragraphs show, there is no way to satisfy the above GDPR requirements while accommodating the use of privacy dark patterns. Put differently, the requirements on consent and choice under the GDPR implicitly forbid many instances of privacy dark patterns – especially those that serve to subvert users’ autonomy and prevent them from making meaningful choices.

To buttress the above assertion, the EDPB has, in the above-referenced Guidelines on Consent, clarified that consent cannot be said to be freely given where “access to services and functionalities [is] made conditional on the consent”;110 or where “the data subject is unable to refuse or withdraw his or her consent without detriment”111 or otherwise “feels compelled to consent or will endure negative consequences if they do not consent”;112 or where “… the process/procedure for obtaining consent does not allow data subjects to give separate consent for personal data processing operations respectively (e.g. only for some processing operations and not for others)”113 – thereby prohibiting, respectively: (x) the You Can Run But You Can’t Hide or Impenetrable Wall; (y) the Rewards and Punishment; and (z) the Bundling of Services and Lack of Granular Choices privacy dark patterns. The Privacy Zuckering and Misleading and Unbalanced Information privacy dark patterns are also prohibited, since informed consent entails the provision of clear, accurate, and unambiguous “information to data subjects prior to obtaining their consent… to enable them to make informed decisions, understand what they are agreeing to, and for example exercise their right to withdraw their consent”.114

Also, the Bad Defaults or Default Settings, Permissions and Always-On Settings, and Default Sharing patterns are prohibited by the Regulation, since unambiguous consent “… requires a statement from the data subject or a clear affirmative act, which means that it must always be given through an active motion or declaration”.115 As the EDPB puts it: “… [c]ontrollers must avoid ambiguity and must ensure that the action by which consent is given can be distinguished from other actions. Therefore, merely continuing the ordinary use of a website is not conduct from which one can infer an indication of wishes by the data subject to signify his or her agreement to a proposed processing operation”.116
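
Translating that requirement into interface logic, a compliant-leaning implementation would record consent only upon an explicit affirmative act and would treat inaction, continued browsing, or a pre-ticked box as no consent at all. The sketch below is a hypothetical illustration of that rule only, not an official compliance recipe.

// Hypothetical sketch: consent is recorded only when the user performs
// a clear affirmative act, never inferred from defaults or from simply
// continuing to use the site (per the unambiguity requirement above).
type ConsentSignal =
  | { kind: "explicit_click"; purpose: string; granted: boolean }
  | { kind: "continued_browsing" }
  | { kind: "preticked_box" };

function recordConsent(signal: ConsentSignal): boolean {
  switch (signal.kind) {
    case "explicit_click":
      // Only an active declaration by the user counts.
      return signal.granted;
    case "continued_browsing":
    case "preticked_box":
      // Inaction and preselected options are not valid consent.
      return false;
  }
}

// Example: scrolling past a banner yields no consent...
console.log(recordConsent({ kind: "continued_browsing" })); // false
// ...while an explicit, purpose-specific "yes" does.
console.log(
  recordConsent({ kind: "explicit_click", purpose: "analytics", granted: true })
); // true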

Finally, Fogging Identification with Security, Sweet Seduction, and Deceptive Click Flow are all prohibited by the Regulation since consent must be specific, and regulated entities “… must apply [respectively]: (i) [p]urpose specification as a safeguard against function creep, (ii) [g]ranularity in consent requests, and (iii) [c]lear separation of information related to obtaining consent for data processing activities from information about other matters.”117

Besides the GDPR’s provisions on consent just examined, the Regulation’s requirements on privacy by design and privacy by default also tackle the threats posed by privacy dark patterns. Thus, Recital 78 states that: “When developing, designing, selecting and using applications, services and products that are based on the processing of personal data or process personal data to fulfil their task, producers of the products, services and applications should be encouraged to take into account the right to data protection when developing and designing such products, services and applications… ”118 And Article 25 requires data controllers to ensure that they process “… only personal data which are necessary for each specific purpose of the processing. That obligation applies to the amount of personal data collected, the extent of their processing, the period of their storage and their accessibility”.119

Also, in their Guidelines 4/2019 on Article 25 Data Protection by Design and by Default, the EDPB has insisted that “[d]ata processing information and options should be provided in an objective and neutral way, avoiding any deceptive or manipulative language or design”.120 This requirement would prohibit, for example, the following privacy dark patterns: Address Book Leeching, Immortal Accounts, Default Sharing, Hidden Default Settings, and Permissions and Always-On Settings.

That the GDPR tackles privacy dark patterns through the above requirements highlights the GDPR’s comprehensiveness, and so, notwithstanding the absence of explicit prohibitions in the GDPR and other EU privacy laws, it is hard to insist that there is a legal vacuum regarding privacy dark patterns in Europe. Thus, one may argue, as many have, that there is really no need to reinvent the wheel in Europe and that enforcing compliance is the next best step.121 As one legal expert puts it: “I have the impression that, all in all, at the legal level the European Union is equipped to deal with [privacy] dark patterns, despite an absence of explicit references. It would be good if the [EDPB] would intervene – with interpretations geared towards fairness and examples relating to dark patterns…” A similar sentiment is expressed by Rossana Ducato, a researcher with the Catholic University of Louvain in Belgium, who also thinks that while there is no legal vacuum in Europe, some form of progressive, dedicated guidelines or interpretations is needed to address the issue of privacy dark patterns. As Ducato puts it: “… in Europe there is no legal vacuum. However, there are still various problems, such as … the problem of definitions and distinguishing the various types of dark patterns. This is why opening a broad, serious debate… and eventually drafting common guidelines, is likely to be the more elegant strategy. We don’t need an umpteenth directive: we need to work on the application of the laws and user awareness.”122

I agree with the above views to a large extent. But of course more can and should probably be done in Europe in terms of regulating privacy dark patterns. For instance, the EDPB has recognized that the GDPR’s provisions on data protection by design and by default are seriously limited in that they impose an obligation only on controllers and not on the developers of the products and technology used to process personal data.123 Agreeing with the EDPB, the European Consumer Organization has called this observation a “legislative gap”124 and “a major vacuum of the GDPR”.125 The European Consumer Organization has stated, in addition, that “third parties who may possibly intervene in the product/service development – including manufacturers, product developers, application developers and service providers – are only mentioned in recital 78 which does not place a requirement on them to comply with [data protection by design and default], as this remains with the controller”.126 This makes sense especially when one considers the now-common possibility of a third-party entity designing the web or mobile application which a controller uses, or otherwise offering such a platform (which the controller may, if it so wishes, customize with its own “look and feel”) on a software-as-a-service basis.

And then in 2019, Giovanni Buttarelli, who served as the European Data Protection Supervisor, acknowledged that despite existing regulations, privacy dark patterns “circumvent [data protection] principles by ruthlessly nudging consumers to disregard their privacy and to provide more personal data than necessary”127 and stressed that privacy dark patterns remain “a problem that has to be addressed”128 because not only do they violate the spirit of the GDPR, they also offend European values.129 While specific regulatory measures or provisions would certainly be the preferable option, it is safe to say – considering the effort and work that go into legislating on a continental basis – that, for the foreseeable future, enforcing compliance with the GDPR and issuing more definitive guidelines is probably the most efficient means of fighting privacy dark patterns in the EU.130

On that last observation, it is quite interesting to note that very recently, on 14 March 2022, the EDPB adopted “Guidelines on dark patterns in social media platform interfaces: how to recognise and avoid them”,131 which “offer practical recommendations to designers and users of social media platforms on how to assess and avoid so-called “dark patterns” in social media interfaces that infringe on GDPR requirements”.132 The Guidelines – which represent a huge step forward in the regulation of privacy dark patterns in the EU and reemphasize the EU’s leadership in information privacy protection on the global stage – essentially extinguish any doubts regarding the cumulative effect of strict consent provisions, privacy by design/default requirements, and data protection principles in the GDPR.

Notably, the EDPB defined (privacy) dark patterns as “interfaces and user experiences implemented on social media platforms that cause users to make unintended, unwilling and potentially harmful decisions regarding the processing of their personal data”,133 which is in consonance with the approach in the US (discussed below), where the metric for determining a privacy dark pattern is objective – i.e., based on the function or effect of a design pattern as opposed to the intention or aim of the designer.

The Approach in the United States

Lacking the comprehensiveness that characterizes the GDPR and the Guidelines issued to clarify, reemphasize, and broaden the scope of the GDPR (all of which combined make canvassing for “an umpteenth directive”134 a tough, impractical, and unnecessary move), the US is adopting a different approach to regulating privacy dark patterns.

As background, there is no single, comprehensive federal privacy law in the US. There are, however, a number of industry-focused federal privacy laws, including the Health Insurance Portability and Accountability Act (which establishes privacy and security rules for the processing and disclosure of protected health information), the Children’s Online Privacy Protection Act (designed to limit the collection and processing of children’s personal information), and the Gramm-Leach-Bliley Act (which limits when a financial institution may disclose a consumer’s non-public personal information to non-affiliated third parties). Completing the patchwork of vertical federal privacy regulations is a “new generation of consumer-oriented privacy laws coming from the states”,135 with California leading the way and Colorado, Virginia, and other states following closely.

As evidence of its pioneer status, California became the first state to write privacy dark patterns into law when the California Privacy Rights Act (“CPRA”)136 was approved in 2020. The CPRA amends and extends California’s landmark privacy law – the California Consumer Privacy Act (“CCPA”) of 2018.137

In its definition of consent, the CPRA states that the “[a]cceptance of a general or broad terms of use or similar document that contains descriptions of personal information processing along with other, unrelated information, does not constitute consent. Hovering over, muting, pausing, or closing a given piece of content does not constitute consent. Likewise, agreement obtained through use of dark patterns does not constitute consent.”138 The law then defines a “dark pattern” as a “user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision-making, or choice, as further defined by regulation”139 and stipulates that a business subject to the law must “ensure that any link to a webpage or its supporting content that allows the consumer to consent to opt-in… does not make use of any dark patterns”.140 The CPRA goes further to also require, in certain circumstances, that: (x) the consent webpage be designed to allow “[a] consumer or a person authorized by the consumer to revoke [their] consent as easily as it is affirmatively provided”;141 (y) “the link to the webpage [must] not degrade [a] consumer’s experience on the webpage the consumer intends to visit and [must] have a similar look, feel, and size relative to other links on the same webpage”;142 and (z) “the consent webpage [must comply] with technical specifications set forth in regulations adopted [under the CPRA]”.143

In July 2021, Colorado joined California and Virginia to become the third US state to pass comprehensive data privacy legislation – the Colorado Privacy Act (CPA).144 Signed into law on 8 July 2021, and scheduled to go into effect on 1 July 2023, the CPA also specifically prohibits dark patterns by adopting language similar to that of the CPRA.145 In its own definition of consent, the CPA states that the “(a) acceptance of a general or broad terms of use or similar document that contains descriptions of personal data processing along with other, unrelated information; (b) hovering over, muting, pausing, or closing a given piece of content; and (c) agreement obtained through dark patterns” do not constitute consent.146 Like the CPRA, the CPA defines “dark pattern” as a “user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision-making, or choice”.147

By explicitly providing that “agreement obtained through dark patterns”148 does not constitute consent to personal information collection and processing, and by defining dark patterns as design patterns that subvert users’ autonomy, the CPRA and the CPA adopt the twin threshold of benchmarking problematic privacy dark patterns against the user-autonomy-subversion and regulatory-objective-weakening metrics. As King and Stephan put it in their brilliant analysis of the CPRA: “[a] focus on regulatory objectives emphasizes the necessity of using a performance-based standard. But grounding a definition of dark patterns on interference with individual autonomy provides a basis for evaluation – one based on human-centered design… [adopting this twin approach] opens [the] door for regulators to avail themselves of a rich set of methods, tools, and research literature on which to evaluate consent mechanisms.”149

Also, it is worth noting that, in defining dark patterns, the CPRA and the CPA focus on the function or effect of UI design patterns, as opposed to the intent of the UI designer or platform provider. This approach aligns with the view that unburdening or decoupling the determination of dark patterns from designers’ intents acknowledges the power of choice architecture and makes the statement that if digital platform designers could conceive infinite ways to design their platforms and they chose the privacy-intrusive forms, their designs should qualify as dark patterns anyway, whether or not there was intent or resolve on their part to sabotage people’s autonomous agency and undermine the requirements of applicable privacy laws. Since companies and digital platform providers have long invested in technologies and sciences that help them exploit choice architecture, it seems only fair to now demand that those companies begin to invest in compliance and shoulder the responsibility of eliminating any subversive or potentially manipulative design choices or elements on their platforms. These laws, therefore, make the silent statement that there is a need to move away from a passive regime, where technology companies can successfully argue that they did not intend to manipulate users or did not realize that an implementation of privacy features constitutes a dark pattern, to an active regime, where these companies pay attention to the design of privacy features as a matter of regulatory compliance.

Meanwhile, before the approval of the CPRA in November 2020, a number of regulations governing compliance with the CCPA had already been issued and became effective in August 2020, with additional amendments to the regulations becoming effective in March 2021. Article 2 of the CCPA Regulations (on Notice to Consumers) bans certain categories of privacy dark patterns, especially those that have “the substantial effect of subverting or impairing a consumer’s choice to opt-out”,150 even though it does not use the words “dark patterns”. To illustrate, the Regulations forbid a “business’s process for submitting a request to opt-out [which] require[s] more steps than that business’s process for a consumer to opt-in to the sale of personal information after having previously opted out”151 (otherwise called Ease152); require that a business refrain from using “confusing language, such as double-negatives (e.g., “Don’t Not Sell My Personal Information”), when providing consumers the choice to opt-out”153 (otherwise called Trick Question);154 and provide that “[u]pon clicking the “Do Not Sell My Personal Information” link, the business shall not require the consumer to search or scroll through the text of a privacy policy or similar document or webpage to locate the mechanism for submitting a request to opt-out”155 (otherwise called Obfuscating Consent).156

To conclude this part, it is worth noting that before the CCPA and the CPRA,157 federal lawmakers in the US had introduced the United States Consumer Data Privacy Act, which contained, among other things, provisions requiring covered entities to publish transparent privacy notices and to refrain from denying goods or services to individuals who exercise their privacy rights, and the Deceptive Experiences To Online Users Reduction Act (the DETOUR Act), which required, among other things, that in order to obtain consent or personal information, interfaces must not subvert or undermine user autonomy, decision-making, or choice. Although these bills were never passed, parts of them have been consolidated with another bill, the Filter Bubble Transparency Act,158 and all three have now morphed into the US Setting an American Framework to Ensure Data Access, Transparency, and Accountability Act159 (the SAFE DATA Act) – “the strongest piece of privacy legislation put forth by Senate Republicans to date”,160 which, if passed, “would create a single federal standard for consumer data privacy and pre-empt all state consumer data privacy laws”.161

Although the SAFE DATA Act does not expressly prohibit the use of privacy dark patterns, it does, however: (x) require that privacy policies be published and “made available, in a clear and conspicuous manner, to the public”;162 (y) prohibit the inference of consent “from the inaction of [an] individual or [an] individual’s continued use of a service or product…”;163 and (z) require the conduct of privacy impact assessments to determine the extent to which “any customizable privacy settings included in a service or product offered by the covered entity are adequately accessible to individuals who use the service or product and are effective in meeting the privacy preferences of such individuals”164 and the extent to which privacy settings “provide an individual with adequate control over the individual’s covered data”.165 The SAFE DATA Act is yet to be passed into law.

PART 3

This part relies on the exposition and analysis in Parts 1 and 2 of this paper as a base from which to launch a critique of the current privacy regime in Canada and to offer recommendations on how to curb the proliferation and use of privacy dark patterns in Canada.

On Canada’s Information Privacy Regime166

Privacy legislation in Canada is generally divided into laws applicable to the public and private sectors and laws applicable at the federal and provincial/municipal levels. There are also a number of sector-specific laws, such as the Bank Act and provincial laws regulating credit unions’ dealings with credit and personal information.

In the public sector, the major information privacy legislation is the Privacy Act,167 which was passed in 1983 and applies to personal information that federal government institutions168 collect, use, and disclose. Since the ’80s, Canadian provinces have passed their own public-sector statutes, often combining privacy and access-to-information provisions in a single statute, almost uniformly entitled “Freedom of Information and Protection of Privacy Acts”.169

In the private sector, the federal Personal Information Protection and Electronic Documents Act (“PIPEDA”)170 is the primary law governing the collection, use, and disclosure of personal information by private entities in the course of for-profit, commercial activities across Canada.171 Some provinces – namely Alberta, British Columbia, and Quebec – have enacted their own general private-sector legislation that has been deemed substantially similar to PIPEDA,172 meaning that PIPEDA no longer applies in those provinces. (It is crucial to mention, of course, that even in Alberta, British Columbia, and Quebec, PIPEDA still applies to ‘federal works, undertakings, and businesses’, and to other private organizations ordinarily subject to provincial privacy laws where those organizations engage in inter-provincial or international use, disclosure, or transfer of data.)

Meanwhile, health information is specially protected in Canada, owing largely to its sensitivity. At the federal level, PIPEDA still applies to organizations that collect, use, and disclose health information. Many provinces also have their own health information privacy statutes; however, only the statutes of New Brunswick, Newfoundland and Labrador, Nova Scotia, and Ontario have been declared substantially similar to PIPEDA. Thus, while PIPEDA no longer regulates health information in those provinces (at least to the extent that the health information is covered by their statutes), organizations in provinces without a health information privacy statute, or with a statute that has not been deemed substantially similar to PIPEDA, remain subject to PIPEDA and, where one exists, to the provincial health information privacy statute.173

Given this paper’s emphasis on private, commercial entities, it seems appropriate to confine the following discussion to the private-sector information privacy regime in Canada – particularly PIPEDA – partly because of PIPEDA’s international application and influence, and its status as the cornerstone of Canada’s private-sector privacy regime.

Privacy Dark Patterns and PIPEDA

PIPEDA174 was “born in the late 1990s at a time when e-commerce was largely a curiosity and companies such as Facebook did not exist”.175 Outdated and outmoded, the law has been criticized for being overly flexible, loosely worded, and “difficult to understand”,176 and for failing to take a rights-based approach to protecting Canadians’ privacy.177 Compared with California’s CPRA (or Colorado’s Privacy Act) and the EU GDPR (considered in Part 2), PIPEDA neither explicitly prohibits privacy dark patterns (as the CPRA does in the US) nor contains strong, carefully crafted, and prescriptive provisions that, taken together, truly prohibit privacy dark patterns by ensuring user autonomy, guaranteeing meaningful choices, and requiring privacy by design (as the GDPR and the recently released EDPB Guidelines on dark patterns in social media platform interfaces do).

Like the GDPR, PIPEDA recognizes ‘knowledge and consent’ (or notice and choice) as a central basis for collecting individuals’ personal information; but unlike the GDPR, PIPEDA singles out consent as the cornerstone and default ground for information collection, from which broad exceptions are carved out. This is a problem on its own. However, while consent may not be the perfect legal basis for information processing,178 I focus here not so much on its weakness – seeing as it continues to be at the heart of privacy regimes across the globe, despite its inherent flaws – but on its relative porousness and permissiveness under PIPEDA.

Thus, while companies generally have an obligation to obtain valid consent, consent under PIPEDA need not be express, and implied consent is generally considered appropriate (except where the information is “likely” to be considered sensitive, could fall outside the reasonable expectations of the individual, and/or could create a meaningful residual risk of significant harm for the individual).179 Implied consent is signalled through conduct, as opposed to a clear affirmative action – so, for instance, consent to collect one’s Internet Protocol address or system information may be implied where one simply accesses a website. Yet one cannot possibly evaluate or have knowledge of the privacy policy of that website until access is gained, meaning “that the terms of an implied-consent policy contain a “Catch-22” implications: [t]he user must accept the [privacy] policy before he or she may read it”.180
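
To make the mechanics of that Catch-22 concrete, the following is a minimal, purely hypothetical sketch (TypeScript, using Node’s built-in http module; no real website or service is implied). The server necessarily receives the visitor’s IP address and device information on the very first request – that is, before the privacy policy describing that collection could even be delivered, let alone read.

```typescript
// Hypothetical illustration only. The visitor's IP address and user agent are
// available to the server the instant the page is requested -- prior to any
// notice being shown or any choice being exercised by the visitor.
import { createServer } from "node:http";

const server = createServer((req, res) => {
  const visitorIp = req.socket.remoteAddress;        // collected on first contact
  const userAgent = req.headers["user-agent"] ?? ""; // likewise collected implicitly

  console.log(`Request from ${visitorIp} (${userAgent}) for ${req.url}`);

  // The privacy policy explaining the above collection is only served afterwards.
  res.writeHead(200, { "Content-Type": "text/html" });
  res.end('<a href="/privacy-policy">Read our privacy policy</a>');
});

server.listen(8080);
```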

Viewed through this lens, the implied consent provisions in PIPEDA undermine the idea of “real” consent or meaningful choice, and fail completely to curb the use of privacy dark patterns such as Hidden Default Settings and You Can Run But You Can’t Hide, both of which work without an individual’s choice or input. Indeed, one may even argue that implied consent under PIPEDA enables privacy dark patterns, since it makes it possible to collect from people personal information they would rather not give out. To be sure, Section 6(1) of PIPEDA does benchmark the validity of consent – whether obtained expressly or impliedly – against the ‘reasonable expectation’ of individuals; however, as experts have observed, “… the idea of ‘reasonable expectations’ is notoriously slippery. It could refer to whether individuals actually expect something, which does not deter bad behaviour as individuals can expect organizations to engage in problematic practices”.181

Again, despite making consent the central basis of information collection, PIPEDA exempts companies from core obligations that make consent meaningful. Thus, as long as it is reasonable to expect that an individual would understand the nature, purpose, and consequence of the collection of personal information to which they are consenting, companies are not strictly required – because Section 5(2) of PIPEDA relieves companies of an obligation whenever the word “should” is used in Schedule 1 of PIPEDA182 – to: specify, at or before the time of collection, the purposes for which personal information is being collected to the individual from whom it is being collected;183 explain to individuals the purposes for which their personal information is being collected; or develop a personal information retention and/or destruction policy. Because their primary function is not to preclude knowledge but to leverage choice architecture and human irrationality in ways that undermine “meaningful” consent, most privacy dark pattern techniques – including, for example, Privacy Zuckering, Immortal Accounts, Fogging Identification with Security, and Repeated Nudging – easily pass the “nature, purpose, and consequence” test of PIPEDA.

In other words, it is quite possible for certain privacy dark pattern techniques to impact an individual’s decision-making in a setting where, although the individual is reasonably expected to understand the nature, purpose, and consequence of the information collection to which they are consenting, that individual is unable to determine on the spot – by reason of the blanket exemptions enjoyed by companies by virtue of Section 5(2) – the specific purposes for which their information is going to be used or for how long it is going to be stored, or to easily access the company’s information erasure/destruction policy.

Also, considering that privacy dark patterns leverage design elements through choice architecture and work by exploiting the lack of procedural clarity in privacy legislation, it is notable that PIPEDA does not contain any provision requiring that digital platforms be designed with privacy in mind. Thus, although the concept of privacy by design originated in Canada,184 it does not currently form part of PIPEDA and is therefore not a legal obligation – which is not ideal, considering that privacy by design is considered by many “[to] be a, if not the, crucial element in protecting privacy rights meaningfully”.185 As part of their recommendations presented to the House of Commons on PIPEDA, the Standing Committee on Access to Information, Privacy and Ethics noted that “[o]ne way to improve PIPEDA’s privacy mechanisms is to focus on privacy protection right from the design stage of services and systems.”186 In their own words, “[t]he Committee believes that privacy by design is an effective way to protect the privacy and reputation of Canadians. This proactive, integrated approach should be at the heart of any PIPEDA review.”187
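
Although the statute itself is silent on the point, the intuition behind privacy by design and by default can be sketched in a few lines of purely hypothetical code: every optional data flow starts in its most privacy-protective state, and only an explicit, affirmative user action can switch it on. The interface, setting names, and logic below are invented for illustration and do not reflect any statutory requirement or any company’s actual implementation.

```typescript
// Hypothetical sketch of "privacy by default": optional data flows are off until
// the user affirmatively opts in; silence, inaction, or pre-ticked boxes never count.

interface PrivacySettings {
  personalizedAds: boolean;
  preciseLocationSharing: boolean;
  analyticsTelemetry: boolean;
  thirdPartyDataSharing: boolean;
}

// Privacy-protective defaults: nothing is collected or shared out of the box.
const defaultSettings: PrivacySettings = {
  personalizedAds: false,
  preciseLocationSharing: false,
  analyticsTelemetry: false,
  thirdPartyDataSharing: false,
};

// The only way a setting becomes true is through an explicit user choice,
// recorded with a timestamp so the consent can later be demonstrated or withdrawn.
function recordOptIn(
  settings: PrivacySettings,
  setting: keyof PrivacySettings,
  userConfirmed: boolean,
): PrivacySettings {
  if (!userConfirmed) return settings; // inaction never flips a default
  console.log(`Opt-in to "${setting}" recorded at ${new Date().toISOString()}`);
  const updated = { ...settings };
  updated[setting] = true;
  return updated;
}

const afterChoice = recordOptIn(defaultSettings, "personalizedAds", true);
console.log(afterChoice.personalizedAds); // true only because of an affirmative choice
```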

The cumulative effect of the relatively weak knowledge-and-consent architecture under PIPEDA,188 the non-inclusion of privacy-by-design requirements, and the failure to adopt a rights-based approach to protecting privacy is that PIPEDA is structurally weak and essentially ineffective in combating privacy dark patterns in Canada. Not only does this unnecessarily tie the hands of regulators; it also affords Canadians very little room for redress, and it imposes the burden of ambition on courts, which would need to adopt a creative and liberal interpretation of PIPEDA to curb the use of whatever privacy dark pattern is under consideration.

Intended to overhaul PIPEDA and replace it with modern privacy legislation, Bill C-11 (“An Act to enact the Consumer Privacy Protection Act and the Personal Information and Data Protection Tribunal Act and to make consequential and related amendments to other Acts”), more simply referred to as the Digital Charter Implementation Act, 2020 (now defunct), did not live up to expectations either, and was condemned by Canada’s Office of the Privacy Commissioner as “a step back overall for privacy”.189

For the purpose of our current discourse, it is worth noting that while the bill does forbid obtaining consent through deceptive or misleading practices190 – a progressive provision – it simultaneously weakens the architecture of knowledge and consent by including broad and unreasonable exceptions which preclude the requirements of knowledge or consent for activities: necessary to provide or deliver a product or service that the individual has requested from the organization; carried out in the exercise of due diligence to prevent or reduce the organization’s commercial risk; for the organization’s information, system or network security; necessary for the safety of products or services provided by the organization; or where obtaining knowledge and consent would be impracticable because the organization has no direct relationship with the individual.191

Even more bafflingly, the limitations imposed by Section 18(1) – which stipulates that an organization seeking to collect personal information without knowledge or consent can only do so where “(a) a reasonable person would expect such a collection or use for that activity; and (b) the personal information is not collected or used for the purpose of influencing the individual’s behaviour or decisions” – are rendered useless in the same bill by the provisions of Section 62 on openness and transparency, which require only that organizations “make readily available” a “general account” of how the organization uses personal information, a provision that “seems unlikely to be satisfactory both because of the generality of the account that must be provided and the fact that it must only be ‘readily available’ rather than serving a knowledge or notice function”.192

Bill C-11 fails in many other respects,193 and it is probably a good thing, then, that it died on the Order Paper when Parliament was dissolved on 15 August 2021 and the federal election was called. It is hoped that, for any new bill introduced in lieu of the defunct Bill C-11, the relevant stakeholders will consider the harms of privacy dark patterns and explore ways to combat them, drawing insights from jurisdictions such as the EU and the US.

At this juncture, it is crucial to note that none of the above criticism of PIPEDA or Bill C-11 is to say that private-sector privacy laws in Canada are entirely deficient in combating privacy dark patterns. For example, PIPEDA does prohibit the collection of personal information through deception,194 and the Office of the Privacy Commissioner of Canada’s Guidelines on Consent do recommend that regulated entities “design or adopt innovative and creative ways of obtaining consent, which are just-in-time, specific to the context, and suitable to the type of interface”.195 Also, Alberta’s and British Columbia’s Personal Information Protection Acts and Quebec’s update to its private-sector privacy law all recognize that consent obtained through deception or misleading practices cannot be valid.196

Given these provisions, one is tempted to jump to the conclusion that these laws have a status substantially similar to the GDPR’s, in that they also contain implicit prohibitions of privacy dark patterns. The problem with that conclusion is that these provisions, while useful, are not nearly adequate. In other words, the problems with current Canadian private-sector privacy laws are that: (x) they lack, in their letter and interpretation, the prescriptiveness and specificity that characterize the GDPR and its Guidelines, so standalone provisions prohibiting consent obtained through deception cannot truly address the threats posed by privacy dark patterns in their many forms; (y) they fail to require regulated entities to protect privacy by design and by default, two requirements that work together to help companies build privacy into their products, services, and operations, and which privacy scholarship has put forward as a formidable way to lessen the inadequacies of privacy policies and the notice-and-choice architecture; and (z) given the clear absence of privacy dark patterns from regulatory and policy conversations in Canada – which is, in a way, an implicit exoneration of companies currently using privacy dark patterns – it is hard to argue, as scholars have argued in favour of the GDPR, that there is no legal vacuum in Canadian privacy laws. But perhaps most importantly, recognizing the importance of specifically addressing the issue, the EU has gone ahead to issue dedicated Guidelines on privacy dark patterns: at the very least, this is what Canada must now do.

Regulating Privacy Dark Patterns in Canada: The Way Forward

Canadians value their privacy, even if they do not “feel informed about how their personal information is handled by companies”197 and even where more than half “have not very much or no information at all about [how their personal information is handled]”.198 A move to address the threats posed by privacy dark patterns is a great way to champion the privacy rights of Canadians, fill the vacuum in current laws, and maintain the adequacy status of our laws vis-à-vis the EU GDPR. As things stand, Canada has a great opportunity to evaluate the approaches in the EU and the US and adopt a carefully tailored solution. I make some recommendations below.

For starters, when reviewing Canadian private-sector privacy laws such as PIPEDA, Canadians’ privacy should be positioned as a human right, as opposed to a consumer right. Many other jurisdictions, including the EU, already draft their privacy laws from a rights-based perspective; Canada should too. Quoting Dr. Teresa Scassa, the OPC has noted in its comments on Bill C-11 that “[a] human rights-based approach to privacy is one that places the human rights values that underlie privacy protection at the normative centre of any privacy legislation…”199

Further to the above, Canadian privacy laws should seek to balance the interests of average consumers with those of businesses, even while recognizing the friction necessarily generated by the power imbalance between the two classes. Thus, in order to promote responsible innovation and the protection of Canadians’ personal information, provisions on consent should be firmed up, such that broad and unnecessary exceptions do not create regulatory loopholes that permit businesses to simply bypass any requirement to obtain clear, unambiguous, and specific consent.200 Also, where feasible, implied consent should be eliminated completely, unless it is possible to clarify the scenarios in which it is acceptable; even then, people should always have and maintain the right to opt out easily where their consent is implied. This generally reflects the approach taken by California’s CCPA considered in Part 2 of this paper.

Where feasible, Canadian privacy laws should explicitly recognize that privacy dark patterns render consent invalid and are, therefore, prohibited. In terms of what it would look like, this explicit recognition should ordinarily entail: a definition of dark patterns; a requirement that regulated entities take active steps to eliminate instances or appearances of privacy dark patterns on their digital platforms and embed privacy in digital products and services by design; the specific barring of the commonest types of privacy dark patterns (as is done under the CPRA, for instance); a stipulation that regulated entities that meet certain pre-determined benchmarks (e.g., those that routinely collect personal information from more than 10,000 people in Canada) must file annual returns with the regulators showing their general state of compliance; a reaffirmation of the powers of regulators to issue more specific, equally binding guidelines on privacy dark patterns; and the stipulation of specific, graded penalties for regulated entities that fail to comply substantially with the privacy dark patterns provisions.

On the above point, the interpretation and enforcement of privacy dark patterns provisions should be anchored to the twin thresholds of user autonomy and regulatory objectives. In that regard, some perspective can be borrowed from the US, where privacy dark patterns are defined objectively, in terms of their function or effect of subverting user autonomy, and not subjectively, in terms of the intent of a service provider.201 Then, to determine whether enforcement is necessary, the final analysis should be based on whether the dark pattern in question undermines or seeks to undermine the objectives of the relevant privacy regulations.

Outside of regulation, other measures can be adopted to eliminate, or reduce to the barest minimum, the appearance of privacy dark patterns on digital platforms in Canada. For example, last year, Apple and Google announced the imposition of new privacy requirements on mobile applications available for download in the Apple App Store and Google Play Store, which are intended to, among other things, “enable users to initiate deletion of their accounts from within the app itself”202 and make it highly convenient for “Google Play users [to] learn how [a particular] app collects and shares user data before downloading the app”.203 Also last year, Apple “released an update for iPhones with a new popup that asked users if they wanted to allow apps on their phones to target the user for ads. IPhone [sic] owners could easily opt-out by tapping a button labelled “Ask App Not to Track””,204 and then last month, Google announced that “it was working on privacy measures meant to limit the sharing of data on smartphones running its Android software, [although] the company promised those changes would not be as disruptive as a similar move by Apple last year”.205

While these moves do not directly concern or curb privacy dark patterns, it is hard to miss the fact that Google and Apple are the biggest smartphone software providers in the world and “hold significant sway over what mobile apps can do on billions of devices”.206 It stands to reason, therefore, that an effective way to curb the proliferation of privacy dark patterns – at least in mobile applications – would be to issue clear but flexible privacy design requirements through the application stores operated by these companies. For web applications, a similar approach can be adopted by mandating website hosting platforms to ensure that web applications hosted on their platforms do not use pre-specified types of privacy dark patterns.

Also, it may be helpful to consider establishing in Canada a neutral, third-party certification entity that reviews the practices of regulated entities – especially multinational entities like Facebook that collect huge amounts of Canadians’ personal information – to assess the compliance of privacy-related design and presentation elements on their platforms. This certification entity could compile a list of compliant entities and issue some form of recognition seal. Overall, this sort of official endorsement can have a positive effect on a company’s bottom line by simultaneously boosting the company’s reputation and helping it gain consumers’ trust.

Finally, regulators can also play a crucial role, as part of their chief duty to protect Canadians’ privacy interests. Thus, among other things, regulators should: (x) build expertise as part of their efforts to push back against privacy dark patterns; (y) create public awareness of the existence of, and threats posed by, privacy dark patterns; (z) create an easily accessible portal for Canadians to report suspected cases of privacy dark patterns; (xx) hold conferences or workshops involving relevant stakeholders as part of the effort to educate representatives of regulated entities and offer general guidance; and (yy) issue or approve codes of conduct for designers and product managers – reflecting best practices, tips for avoiding common design pitfalls, and non-exhaustive suggestions on how to promote user autonomy – to guard against designing platforms’ choice architectures in ways that constitute or promote privacy dark patterns.

Conclusion

For a legal regime built centrally on consent,207 the Canadian privacy regime, as represented by PIPEDA, does little to curb the proliferation and use of privacy dark patterns on digital platforms in Canada, and fails, in certain respects, to impose clear obligations on regulated entities to offset some of the core weaknesses inherent in the knowledge-and-consent framework upon which that regime is built. Yet attempts to review and overhaul federal and provincial private-sector privacy laws have not contemplated or featured privacy dark patterns, and the issue continues to fly under the radar in Canada even as privacy dark patterns garner serious attention in other jurisdictions and prompt – or form part of – legislative changes.

Cases

Royal Bank of Canada v. Trang [2016] 2 SCR 412

Statutes

A Bill for an Act Concerning Additional Protection of Data Relating to Personal Privacy, Senate Bill 21-190, State of Colorado.

Alberta’s Personal Information Protection Act (S.A. 2003, c P-6.5)

An Act to extend the present laws of Canada that protect the privacy of individuals and that provide individuals with a right of access to personal information about themselves, R.S.C., 1985, c. P-21

An Act to promote the efficiency and adaptability of the Canadian economy by regulating certain activities that discourage reliance on electronic means of carrying out commercial activities, and to amend the Canadian Radio-television and Telecommunications Commission Act, the Competition Act, the Personal Information Protection and Electronic Documents Act and the Telecommunications Act, SC 2010, c 23 [CASL]

Bill 64 (An Act to modernize legislative provisions as regards the protection of personal information)

Bill C-11, An Act to enact the Consumer Privacy Protection Act and the Personal Information and Data Protection Tribunal Act and to make consequential and related amendments to other Acts

British Columbia’s Personal Information Protection Act (R.S.B.C. 2003, c. 63)

California Consumer Privacy Act Regulations, Chapter 20

California Privacy Rights Act of 2020, codified Cal Civ Code §. 1798.140 (Deering 2021) [Operative Jan. 1, 2023]

Ontario Personal Health Information Protection Act, 2004, S.O. 2004, c. 3, Sched. A.

Personal Information Protection Act, Statutes of Alberta, 2003 Chapter P-6.5

Personal Information Protection and Electronic Documents Act, SC 2000, c 5.

QuĂ©bec’s Act respecting personal information in the private sector (CQLR c P-39.1)

SAFE DATA Act, 116th Congress

Other Authorities

European Data Protection Board, Guidelines 3/2022 on dark patterns in social media platform interfaces: how to recognise and avoid them (version 1.0) adopted on 14 March 2022

European Data Protection Board, Guidelines 4/2019 on Article 25 Data Protection by Design and by Default, Version 2.0 (Adopted on 20 October 2020) 18

Guidelines 05/2020 on consent under Regulation 2016/679 (Version 1.1) Adopted 4 May 2020. paragraph 3

Government Reports

House of Commons, Canada, “Report of the Standing Committee on Access to Information, Privacy and Ethics, Towards Privacy By Design: Review of the Personal Information Protection and Electronic Documents Act” (House of Commons 2018) 50

Office of the Privacy Commissioner of Canada, “2020-21 Survey of Canadians on Privacy-Related Issues”

Office of the Privacy Commissioner of Canada, “Submission of the Office of the Privacy Commissioner of Canada on Bill C-11, the Digital Charter Implementation Act, 2020”

Office of the Privacy Commissioner, “Guidelines for obtaining meaningful consent” (May 2018, revised 13 August 2021)

The Commission nationale de l’informatique et des libertĂ©s: “Shaping Choices in the Digital World – From Dark Patterns to Data Protection: the Influence of UX/UI Design on User Empowerment” (The Commission nationale de l’informatique et des libertĂ©s 2019)

The Consumer Council of Norway, “Deceived by Design: How tech companies use dark patterns to discourage us from exercising our rights to privacy” (The Consumer Council of Norway 2018) Side 13 av 43

The Consumer Council of Norway, “Every Step You Take: How Deceptive Design Lets Google Track Users 24/7” (The Consumer Council of Norway 2018) 4.

Regulations

Directive 2009/136/EC of the European Parliament and of the Council of 25 November 2009 amending Directive 2002/22/EC on universal service and users’ rights relating to electronic communications networks and services, Directive 2002/58/EC concerning the processing of personal data and the protection of privacy in the electronic communications sector and Regulation (EC) No 2006/2004 on cooperation between national authorities responsible for the enforcement of consumer protection laws

EU General Data Protection Regulation (GDPR): Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation), OJ 2016 L 119/1

EU, the Directive on Unfair Terms in Consumer Contracts, approved by the European Union in 1993 (93/13/EEC) and Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (“Unfair Commercial Practices Directive”)

Books

Alan F Westin, “Privacy and Freedom” (New York: Atheneum, 1970) 7.

C. Alexander, S. Ishikawa, and M. Silverstein, “A Pattern Language: Towns, Buildings, Construction” (Oxford University Press 1977) x.

Chris Lewis, “Irresistible Apps: Motivational Design Patterns for Apps, Games, and Web-based Communities” (Apress, 2014)

Dan Ariely, “Predictably Irrational: The Hidden Forces That Shape Our Decisions” (1st HarperCollins Canada 2008).

Deckle McLean, “Privacy and its Invasion” (Praeger 1995) 9.

E. Gamma, R. Helm, R. Johnson, and J. Vlissides, “Design patterns: elements of reusable object-oriented software” (Pearson Education 1994).

Gloria González Fuster, “The Emergence of Personal Data Protection as a Fundamental Right of the EU” (Springer International Publishing Switzerland 2014) 28.

Daniel Kahneman, “Thinking, Fast and Slow” (17th edn, New York: Farrar, Straus and Giroux 2011)

Kris Klein, “Canadian Privacy: Data Protection Law and Policy for the Practitioner Fourth Edition”, International Association of Privacy Professionals, 2020

Richard H. Thaler and Cass R. Sunstein, “Nudge: Improving Decisions about Health, Wealth, and Happiness” (Yale University Press 2008) 43

William J. Brown, Raphael C. Malveau, Hays W. “Skip” McCormick III, Thomas J. Mowbray, “AntiPatterns: Refactoring Software, Architectures, and Projects in Crisis” (John Wiley & Sons, Inc. 1998)

Shoshana Zuboff, “The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power” (New York: PublicAffairs, 2019).

Journal Articles

Amos Tversky & Daniel Kahneman, “Judgment Under Uncertainty: Heuristics and Biases”, 185 (4157) JSTOR 1124-1131.

Arunesh Mathur, Mihir Kshirsagar, and Jonathan Mayer, “What Makes a Dark Pattern... Dark?: Design Attributes, Normative Considerations, and Measurement Methods” (2021) ACM 13.

Carlos Jensen, Colin Potts, “Privacy Policies as Decision-Making Tools: An Evaluation of Online Privacy Notices” (2004) 6 CHI 471, 477.

Cass R. Sunstein, “Behavioral Analysis of Law” (1997), 64 U. Chi. L. Rev. 1175, 1175

Christine Jolls, Cass R. Sunstein, & Richard Thaler, “A Behavioral Approach to Law and Economics” (1998), 50 Stan. L. Rev. 1471, 1473

Christoph Bösch, Benjamin Erb, Frank Kargl, Henning Kopp, and Stefan Pfattheicher, “Tales from the Dark Side: Privacy Dark Strategies and Privacy Dark Patterns” (2016) 4 PPET 237, 239.

Fiona Westin and Sonia Chiasson, “Opt out of Privacy or “Go Home”: Understanding Reluctant Privacy Behaviours through the FoMO-Centric Design Paradigm” (2019) NSPW 57, 58

Fritsch Lothar, “Privacy dark patterns in identity management” (2017) Lecture Notes in Informatics (LNI), Gesellschaft fĂĽr Informatik, Bonn 93

Gregory Conti and Edward Sobiesk, “Malicious Interface Design: Exploiting the User” (2010) In Proceedings of the 19th International Conference

Gregory Day & Abbey Stemler, “Are Dark Patterns Anticompetitive?” (2020) 72 Ala L Rev 1, 14.

Jamie Luguri, Lior Jacob Strahilevitz, “Shining a Light on Dark Patterns” (2021) 13 JLA 43.

Jennifer King & Adriana Stephan, “Regulating Privacy Dark Patterns in Practice – Drawing Inspiration from California Privacy Rights Act” (2021) 5 Geo. L. Tech. Rev. 251, 272

Johanna Gunawan, Amogh Pradeep, David Choffnes, Woodrow Hartzog, and Christo Wilson, “A Comparative Study of Dark Patterns Across Mobile and Web Modalities” (2021) 5 Proc. ACM Hum.-Comput. Interact. 377, 377: 1.

Jon D. Hanson and Douglas A. Kysar, “Taking Behaviouralism Seriously: The Problem of Market Manipulation” (1999), 74 NYULR, 102, 142

Linda Di Geronimo, Larissa Braz, Enrico Fregnan, Fabio Palomba, and Alberto Bacchelli, “UI Dark Patterns and Where to Find Them: A Study on Mobile Applications and User Perception” (2020). ACM

Lisa M. Austin, “Is Consent the Foundation of Fair Information Practices? Canada’s Experience Under Pipeda” (2006) 56 University of Toronto Law Journal 181

Richard Warner, “Notice and Choice Must Go: The Collective Control Alternative” (2020) 23 (2) Science and Technology Law Review 173

Website Articles

Andy Green, “Complete Guide to Privacy Laws in the US” (Varonis 2 April 2021) accessed 9 March 2022.

Arunesh Mathur et al., “Dark Patterns at Scale: Findings from a Crawl of 11K Shopping Websites” (Arxiv November 2019) accessed 9 February 2022

Bradley Song, “Exploring “Dark Patterns” and the California Privacy Rights Act of 2020” (Gordinier Kang & Kim LLP 14 October 2021) accessed 6 March 2022.

Catherine Clifford, “Meatballs and DIY bookcases: The psychology behind Ikea’s iconic success” (CNBC 5 October 2019) accessed 19 February 2022

Daisuke Wakabayashi, “Google Plans Privacy Changes, but Promises to Not Be Disruptive” (New York Times 16 February 2022) accessed 21 March 2022

Daniel Rosenberg, “The business of UX strategy” (ACM Digital Library 2018) accessed 3 March 2022.

David Krebs, “Canada: Implementing Privacy By Design” (Mondaq 12 November 2018) accessed 17 March 2022.

David Stauss and Stacey Weber, “How do the CPRA, CPA, and VCDPA treat dark patterns?” (Bytebacklaw 16 March 2022) accessed 16 March 2022

European Data Protection Board, “EDPB adopts Guidelines on Art. 60 GDPR, Guidelines on dark patterns in social media platform interfaces, toolbox on essential data protection safeguards for enforcement cooperation between EEA and third country SAs” (2022) accessed 15 March 2022.

Federico Caruso, “Dark patterns: born to mislead” (European Data Journalism Network 13 November 2019) accessed 7 March 2022.

Giovanni Buttarelli, “Legal Design Roundtable, European Data Protection Supervisor” (EDPS 27 April 2019) accessed 11 March 2022

Giovanni Buttarelli, “Legal Design Roundtable, European Data Protection Supervisor” (EDPS.EUROPA 27 April 2019) accessed 11 February 2022

Greg Ferenstein, “The Birth And Death Of Privacy: 3,000 Years of History Told Through 46 Images” (Medium, 24 November 2015) accessed 07 November 2021

Harry Brignull, “Dark Patterns: dirty tricks designers use to make people do stuff” (90 Percent of Everything, 8 July 2010) accessed 11 February 2022

Helena Vieira, “The paradox of wanting privacy but behaving as if it didn’t matter” (LSE 19 April 2018) accessed 7 March 2022

Hinshaw Privacy & Cyber Bytes, “Privacy Bill Essentials: Proposed Federal “Setting an American Framework to Ensure Data Access, Transparency, and Accountability Act”” (HinshawLaw 6 August 2021) accessed 11 March 2022.

Hunton Andrews Kurth, “Apple and Google Announce Effective Dates of New Mobile App Privacy Requirements” (NatLawReview 8 November 2021) accessed 21 March 2022

Jennifer King & Adriana Stephan, “Regulating Privacy Dark Patterns in Practice—Drawing Inspiration from California Privacy Rights Act” (2021) 5 Geo. L. Tech. Rev. 251. See also Dr Timo Klein, “Bits of advice: the true colour of dark patterns” (Oxera 26 November 2021) accessed 4 March 2022

Julian Oliver, Gordan Savicic, and Danja Vasiliev, “The Critical Engineering Manifesto” (The Critical Engineering Working Group Berlin, October 2011-2021) accessed 23 February 2022

Katharina Kopp in Federal Trade Commission, “Bringing Dark Patterns to Light: An FTC Workshop” (FTC 29 April 2021) accessed 2 February 2022

Kif Leswing, “Apple’s ad privacy change impact shows the power it wields over other industries” (CNBC 13 November 2021) accessed 21 March 2022

Maya, “A/B Testing Statistics” (TrueList 1 February 2021) accessed 12 February 2022

Michael Geist, “Canada’s GDPR Moment: Why the Consumer Privacy Protection Act is Canada’s Biggest Privacy Overhaul in Decades” (MichaelGeist 17 November 2020) accessed 19 March 2022

Michaela Smiley, “More privacy means more democracy” (blog.mozilla 25 April 2019) accessed 7 March 2022.

MĂĽge Fazlioglu, “Consolidating US privacy legislation: The SAFE DATA Act” (IAPP 21 September 2020) accessed 11 March 2022

Natasha Lomas, “WTF is Dark Patterns” (TechCrunch 1 July 2018) accessed 13 March 2022.

Professor Ignacio Cofone, “Policy Proposals for PIPEDA Reform to Address Artificial Intelligence Report” (Office of the Privacy Commissioner of Canada November 2020) accessed 22 March 2022.

Professor Lisa Austin, “Who decides? Consent, meaningful choices, and accountability” (Schwartz Reisman Institute for Technology and Society 22 December 2020) accessed 29 March 2022

Project Dark Patterns Tip Line (Stanford Digital Civil Society Lab 2022) accessed 11 February 2022

Rieger and Sinders, “Dark Patterns: Regulating Digital Design” (Stiftung Neue Verantwortung 13 May 2020) accessed 26 February 2022

Sara Morrison, “How your mobile carrier makes money off some of your most sensitive data” (Vox 13 March 2021) accessed 12 February 2022.

Stephanie Nguyen, “The Impact of Dark Patterns on Communities of Color: An FTC Workshop Panel Discussion Recap” (Medium 17 May 2021) accessed 4 March 2022.

Teresa Scassa, “Replacing Canada’s 20-Year-Old Data Protection Law” (23 December 2020) accessed 15 March 2022

The European Consumer Organization, “BEUC Comments on the EPDB Guidelines 4/2019 on Data Protection by Design and Default” Ref: BEUC-X-2020-003 – 16/01/2020 accessed 1 March 2022

William Rinehart, Caden Rosenbaum, and Amanda Ortega, “Public Interest Comment on the “Bringing Dark Patterns to Light” FTC Workshop” (Center for Growth and Opportunity at Utah State University 22 June 2021) accessed 1 March 2022

Yashasvi Nagda, “What is Darkness in Dark Patterns” (Medium 17 March 2020) accessed 14 March 2022.

Endnotes

1 C. Alexander, S. Ishikawa, and M. Silverstein, “A Pattern Language: Towns, Buildings, Construction” (Oxford University Press 1977) x.
2 See generally E. Gamma, R. Helm, R. Johnson, and J. Vlissides, “Design patterns: elements of reusable object-oriented software” (Pearson Education 1994).
3 Object-oriented design is essentially a way of approaching software design – it helps in designing system architecture and solving a software problem.
4 E. Gamma et al. (n 2) 2.
5 Ibid.
6 Andrew Richard Koenig is a former AT&T and Bell Labs researcher and programmer. In coming up with the word ‘antipattern’, he was inspired by the Gang of Four’s book on Design Patterns.
7 William J. Brown, Raphael C. Malveau, Hays W. “Skip” McCormick III, Thomas J. Mowbray, “AntiPatterns: Refactoring Software, Architectures, and Projects in Crisis” (John Wiley & Sons, Inc. 1998) 6.
8 To explain antipatterns even further, consider the following scenarios: (x) a programmer retains undesirable, redundant, or low-quality code because removing it would be expensive, time-consuming, or have unpredictable consequences; (y) a programmer assumes that a favorite solution is universally applicable, even in situations that could have been improved with some creativity; (z) a programmer builds software programs whose structure is barely comprehensible, especially because of the misuse of code structures. These instances of antipatterns are otherwise known as “lava flows”, “golden hammers”, and “spaghetti code” respectively.
9 Brown et al. (n 7).
10 Harry Brignull, “Dark Patterns: dirty tricks designers use to make people do stuff” (“90 Percent of Everything”, 8 July 2010) accessed 11 February 2022.
11 Christoph Bösch, Benjamin Erb, Frank Kargl, Henning Kopp, and Stefan Pfattheicher, “Tales from the Dark Side: Privacy Dark Strategies and Privacy Dark Patterns” (2016) 4 PPET 237, 239.
12 Ibid 238.
13 The term “dark patterns” was first coined in 2010 by UX designer, Harry Brignull, who has since maintained a website where he spreads awareness on dark patterns and shames companies that use them.
14 I am aware that the term “dark patterns” has been criticized as promoting the idea that “dark” connotes bad or evil. This is even more so, considering that the antonym of dark patterns – i.e., ‘light patterns’ – connotes and describes the positive, legal, and acceptable sorts of design patterns. To avoid the “dark” in dark patterns, some stakeholders have suggested using the terms “deceptive patterns”, “manipulative patterns”, or “coercive patterns” instead. As an author that identifies as Black, I understand these concerns all too well. Unfortunately, none of the terms I have yet come across adequately covers the breadth of harmful, illegal, or potentially illegal design patterns covered in this paper; and to use any of the suggested terms above would be to derail the direction of this paper. Thus, for the purpose of this paper only and until an equally useful term is adopted, I shall stick with the current – if somewhat unseemly – term. I will ultimately align with such replacement term as concerned parties or relevant stakeholders shall introduce or adopt.
15 Gregory Day & Abbey Stemler, “Are Dark Patterns Anticompetitive?” (2020) 72 Ala L Rev 1, 14.
16 Arunesh Mathur et al., “Dark Patterns at Scale: Findings from a Crawl of 11K Shopping Websites” (Arxiv November 2019) accessed 9 February 2022 (citing Colin M. Gray, Yubo Kou, Bryan Battles, Joseph Hoggatt, and Austin L. Toombs, ‘The Dark (Patterns) Side of UX Design’ (2018) In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (Montreal QC, Canada) (CHI ’18). ACM, New York, NY, USA, Article 534, 14 pages. Cherie Lacey and Catherine Caudwell. 2019. Cuteness as a “Dark Pattern” in Home Robots. In Proceedings of the 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI ’19). IEEE Press, Daegu, Republic of Korea, 374 – 381, inter alia).
17 Johanna Gunawan, Amogh Pradeep, David Choffnes, Woodrow Hartzog, and Christo Wilson, “A Comparative Study of Dark Patterns Across Mobile and Web Modalities” (2021) 5 Proc. ACM Hum.-Comput. Interact. 377, 377: 1.
18 See for examples: Chris Lewis, “Irresistible Apps: Motivational Design Patterns for Apps, Games, and Web-based Communities” (Apress, 2014); Fiona Westin and Sonia Chiasson, “Opt out of Privacy or “Go Home”: Understanding Reluctant Privacy Behaviours through the FoMO-Centric Design Paradigm” (2019) NSPW 57, 58; and Gregory Conti and Edward Sobiesk, “Malicious Interface Design: Exploiting the User” (2010) In Proceedings of the 19th International Conference on World Wide Web (Raleigh, North Carolina, USA) (WWW ’10). Association for Computing Machinery, New York, NY, USA, 271 – 280, 271.
19 Jamie Luguri, Lior Jacob Strahilevitz, “Shining a Light on Dark Patterns” (2021) 13 JLA 43.
20 Jennifer King & Adriana Stephan, “Regulating Privacy Dark Patterns in Practice – Drawing Inspiration from California Privacy Rights Act” (2021) 5 Geo. L. Tech. Rev. 251. See also Dr Timo Klein, “Bits of advice: the true colour of dark patterns” (Oxera 26 November 2021) accessed 4 March 2022.
21 Brignull (n 10).
22 Recently, a group of Princeton University researchers carried out an extensive analysis of shopping websites and identified 1,800 dark patterns (see Arunesh Mathur et al., “Dark Patterns at Scale: Findings from a Crawl of 11K Shopping Websites” (Arxiv November 2019) accessed 9 February 2022). Another study found that more than 95% of 200 mobile applications contain at least one dark pattern – an outrageous percentage when one considers that the mobile applications in question are 200 of the most popular. (See Linda Di Geronimo, Larissa Braz, Enrico Fregnan, Fabio Palomba, and Alberto Bacchelli, “UI Dark Patterns and Where to Find Them: A Study on Mobile Applications and User Perception” (2020). ACM).
23 Day et al. (n 15) 81.
24 See Project “Dark Patterns Tip Line” (Stanford Digital Civil Society Lab 2022) accessed 11 February 2022.
25 Gunawan et al. (n 17) 377:4.
26 Luguri et al. (n 19) 52.
27 Luguri et al. (n 19) 44.
28 Linda Di Geronimo, Larissa Braz, Enrico Fregnan, Fabio Palomba, and Alberto Bacchelli, “UI Dark Patterns and Where to Find Them: A Study on Mobile Applications and User Perception” (2020). ACM 4.
29 Ibid.
30 Some have focused instead on the efficiency of patterns as the metric that qualifies them as dark. Thus, Yashasvi Nagda, for instance asserts that the “[d]arkness of a pattern is the efficiency with which the pattern is able to camouflage itself, trick user into stakeholder intended task without causing the loss of customer and significantly cause a known or unknown, immediate or gradual damage, either temporal, monetary, social or experiential to the user”. See Yashasvi Nagda, “What is Darkness in Dark Patterns” (Medium 17 March 2020) accessed 14 March 2022.
31 William Rinehart, Caden Rosenbaum, and Amanda Ortega, “Public Interest Comment on the “Bringing Dark Patterns to Light” FTC Workshop” (Center for Growth and Opportunity at Utah State University 22 June 2021) accessed 1 March 2022.
32 Arunesh Mathur, Mihir Kshirsagar, and Jonathan Mayer, “What Makes a Dark Pattern…Dark?: Design Attributes, Normative Considerations, and Measurement Methods” (2021) ACM 13.
33 Ibid.
34 Ibid.
35 This view is based on the idea that it will be less difficult to regulate (privacy) dark patterns by focusing on evidently measurable and objective factors as opposed to elusive and subjective elements. The burden, therefore, should always be on digital platform designers and service providers to be aware of the requirements of relevant laws, to curb their thirst to “collect it all” and be deliberate about eliminating instances of dark patterns on their platforms, and to recognize that dark patterns – whether deployed deliberately or not – could have serious implications for people’s fundamental right to privacy.
36 See generally Christine Jolls, Cass R. Sunstein, & Richard Thaler, “A Behavioral Approach to Law and Economics” (1998), 50 Stan. L. Rev. 1471, 1473 (approaching economic analysis through conception of choice that reflects improved understanding of human behavior); Cass R. Sunstein, “Behavioral Analysis of Law” (1997), 64 U. Chi. L. Rev. 1175, 1175 (theorizing that the future of economic analysis depends upon new understandings of decision making developed through behavioural research); Amos Tversky & Daniel Kahneman, “Judgment Under Uncertainty: Heuristics and Biases”, 185 (4157) JSTOR 1124-1131.
37 Jon D. Hanson and Douglas A. Kysar, “Taking Behaviouralism Seriously: The Problem of Market Manipulation” (1999), 74 NYULR, 102, 142.
38 See generally Dan Ariely, “Predictably Irrational: The Hidden Forces That Shape Our Decisions” (1st HarperCollins Canada 2008). See also: Kahneman, Daniel. “Thinking, Fast and Slow” (17th edn, New York: Farrar, Straus and Giroux 2011).
39 Richard H. Thaler and Cass R. Sunstein, “Nudge: Improving Decisions about Health, Wealth, and Happiness” (Yale University Press 2008) 43.
40 Ibid. 86.
41 Ibid. 43.
42 Catherine Clifford, “Meatballs and DIY bookcases: The psychology behind Ikea’s iconic success” (CNBC 5 October 2019) accessed 19 February 2022.
43 Julian Oliver, Gordan SaviÄŤić, and Danja Vasiliev, “The Critical Engineering Manifesto” (The Critical Engineering Working Group Berlin, October 2011-2021) accessed 23 February 2022.
44 Gunawan et al. (n 17).
45 Thaler and Sunstein (n 39) 85.
46 Also known as split testing, A/B testing is an experiment in which two or more versions of a page element, functionality, design, etc. of a website or mobile application are shown to select participants or actual users at random to determine which version works best in achieving a particular goal. A/B testing helps eliminate guesswork in optimizing web or mobile applications. The A/B testing software market was worth $485 million in 2018. By 2025, the global A/B testing software market is projected to be worth $1.08 billion. See Maya, “A/B Testing Statistics” (TrueList 1 February 2021) accessed 12 February 2022.
47 Hanson and Kysar (n. 37) 142.
48 A/B testing can be used to determine things like: what button size is most effective in prompting participants/users to click on an ad or what font type, font size, and color makes participants interact with a mobile application the most?
49 Without it being apparent, users help companies improve their platforms all the time. As Sebastian Rieger and Caroline Sinders claim, “Users usually do not notice when they participate in A/B tests, because website elements are changed during normal operation and only for a limited number of users”. See: Rieger and Sinders, “Dark Patterns: Regulating Digital Design” (Stiftung Neue Verantwortung 13 May 2020) accessed 26 February 2022.
50 As one author eloquently puts it: “[digital platform providers] figured out a long time ago that they have two ways of making money off of their customers: what those customers pay to use their services, and then, what carriers earn by selling the data those paying customers provide as they use those services. The former is clear and obvious to the customer, especially when the monthly bill comes due. The latter is buried under lengthy and confusing privacy policies and account settings, and most customers don’t even know it’s happening”. See Sara Morrison, “How your mobile carrier makes money off some of your most sensitive data” (Vox 13 March 2021) accessed 12 February 2022.
51 Thaler and Sunstein (n. 40) 76.
52 Deckle McLean, “Privacy and its Invasion” (Praeger 1995) 9.
53 Greg Ferenstein, “The Birth And Death Of Privacy: 3,000 Years of History Told Through 46 Images” (Medium, 24 November 2015) accessed 07 November 2021.
54 Michaela Smiley, “More privacy means more democracy” (blog.mozilla 25 April 2019) accessed 7 March 2022.
55 These are information privacy [i.e., “the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to others” See Alan F Westin, “Privacy and Freedom” (New York: Atheneum, 1970) 7]; territorial privacy [i.e., the protection of an individual’s physical environment against intrusion by other individuals or organizations]; and personal privacy [i.e., freedom from invasion of the person or body].
56 See Alan F Westin, “Privacy and Freedom” (New York: Atheneum, 1970) 7.
57 See generally Zuboff, Shoshana, “The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power” (New York: PublicAffairs, 2019).
58 Gloria González Fuster, “The Emergence of Personal Data Protection as a Fundamental Right of the EU” (Springer International Publishing Switzerland 2014) 28.
59 Katharina Kopp in Federal Trade Commission, ‘Bringing Dark Patterns to Light: An FTC Workshop’ (FTC 29 April 2021) accessed 2 February 2022.
60 Throughout the rest of this paper, I use the term “privacy dark patterns” to move the conversation away from a discussion of dark patterns generally and to limit the scope of the paper only to those “species of dark patterns that nudge, manipulate, or coerce people into divulging more personal information than they really intend to when using digital platforms.”
61 To nudge users, the “Accept” button in privacy notices is noticeably brighter and more easily accessed, while the “Manage Options” button is often greyed out and involves several clicks before users can manage their privacy options. Because defaults are powerful and constitute implicit recommendations, many platforms’ privacy choices are set to the most privacy-intrusive options and enable the collection of unnecessary information. Understanding that users love ease and convenience, certain design choices or elements are implemented that (can) frustrate or confound users into giving away their information. And knowing how much framing matters, privacy options are presented in forms that almost predictably lead users to consent to the collection of their information.
62 Kopp (n 59).
63 The Consumer Council of Norway, “Deceived by Design: How tech companies use dark patterns to discourage us from exercising our rights to privacy” (The Consumer Council of Norway 2018) Side 3 av 43.
64 The European Consumer Organization, “BEUC Comments on the EPDB Guidelines 4/2019 on Data Protection by Design and Default” Ref: BEUC-X-2020-003 - 16/01/2020 accessed 1 March 2022.
65 Stephanie Nguyen, “The Impact of Dark Patterns on Communities of Color: An FTC Workshop Panel Discussion Recap” (Medium 17 May 2021) accessed 4 March 2022.
66 For instance, “Facebook was forced to make changes to its ad platform after it was shown that an ad-targeting category it lets advertisers target ads against, called “ethnic affinity” – aka Facebook users whose online activity indicates an interest in “content relating to particular ethnic communities” – could be used to run housing and employment ads that discriminate against protected groups.” See Natasha Lomas, “WTF is Dark Patterns” (TechCrunch 1 July 2018) accessed 13 March 2022.
67 Nguyen (n 65).
68 The European Data Protection Supervisor (EDPS) is an independent supervisory authority established in accordance with Regulation (EU) No 2018/1725, on the basis of Article 16 TFEU. The EDPS is responsible for ensuring that the fundamental rights and freedoms of individuals, in particular their privacy, are respected when the EU institutions and bodies process personal data.
69 Giovanni Buttarelli, “Legal Design Roundtable”, European Data Protection Supervisor (EDPS.EUROPA 27 April 2019) accessed 11 February 2022.
70 See Daniel Rosenberg, “The business of UX strategy” (ACM Digital Library 2018), accessed 3 March 2022.
71 As shown in Part 1, choice architecture also applies, of course, to the design of physical spaces.
72 See Harry Brignull, “Types of Dark Pattern” (Dark Pattern 2022) accessed 3 March 2022.
73 Other taxonomies of dark patterns have been developed that do not focus on privacy, including, for instance, those developed based on the analyses of malicious interface techniques by Conti and Sobiesk; those developed in the context of designer intents by Gray et al.; and those developed on online manipulation techniques by Mathur et al.
74 Bösch et al. (n 11) 239.
75 Ibid 248.
76 Ibid.
77 Ibid 251.
78 Ibid 250.
79 See generally Fritsch Lothar, “Privacy dark patterns in identity management” (2017) Lecture Notes in Informatics (LNI), Gesellschaft fĂĽr Informatik, Bonn 93.
80 Ibid.
81 Ibid 95.
82 Ibid 97.
83 Ibid 99.
84 The Consumer Council of Norway, “Every Step You Take: How Deceptive Design Lets Google Track Users 24/7” (The Consumer Council of Norway 2018) 4.
85 Ibid 26.
86 Ibid 27.
87 Ibid
88 Ibid 28.
89 Ibid 29.
90 Ibid 30.
91 The Consumer Council of Norway, “Deceived by Design: How tech companies use dark patterns to discourage us from exercising our rights to privacy” (The Consumer Council of Norway 2018) Side 13 av 43.
92 Ibid. Side 19 av 43.
93 Ibid. Side 22 av 43.
94 Ibid. Side 25 av 43.
95 Ibid. Side 31 av 43.
96 The Commission nationale de l’informatique et des libertĂ©s: “Shaping Choices in the Digital World – From Dark Patterns to Data Protection: the Influence of UX/UI Design on User Empowerment” (The Commission nationale de l’informatique et des libertĂ©s 2019) 29.
97 Ibid. 28.
98 Ibid. 29.
99 Ibid.
100 Ibid. 28.
101 It is important to note here that apart from privacy laws, other laws also regulate dark patterns. Interestingly, competition and consumer protection laws have been the traditional legal weapon to resist or regulate most forms of dark patterns. For example, in the EU, the “Directive on Unfair Terms in Consumer Contracts, approved by the European Union in 1993 (93/13/EEC)” and “Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (“Unfair Commercial Practices Directive”)” were one of the first regulations to address dark patterns. There is also some move to incorporate dark patterns regulation into a new regulation proposed by the European Commission – the Digital Services Act – intended to modernize the 2000 EU e-Commerce Directive, and which addresses dark patterns in the context of certain provisions in another legislative proposal, the Digital Markets Act. In the US, the Federal Trade Commission (“FTC”) has leaned on Section 5 of its establishing Act to pursue actions against ‘unfair or deceptive acts or practices’ and so, in 2019, the FTC concluded that Cambridge Analytica misled Facebook users because the former “engaged in deceptive practices to harvest personal information from tens of millions of Facebook users for voter profiling and targeting”, and then issued a record-breaking fine of $5 billion to Facebook as punishment for the user privacy violations revealed by the Cambridge Analytica case. The FTC has also recently issued the “Enforcement Policy Statement Regarding Negative Option Marketing”, warning companies against using dark patterns that trick consumers into subscription services.
102 EU General Data Protection Regulation (GDPR): Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation), OJ 2016 L 119/1. (“GDPR”).
103 This is the foundation upon which most information privacy laws are built and is not to be confused with the concept of choice architecture in digital and physical design discussed in part 1 of this paper. The notice-and-choice architecture being discussed here describes the central obligation of those who collect people’s personal information and who are thus subject to the law (usually companies) to provide sufficient information (or notice) to those from whom the personal information is being collected (usually individuals or natural persons), so that those people can make informed decisions (or choices) regarding how and to what extent they want to share or disclose their personal information.
104 “Processing” is defined by the GDPR as “any operation or set of operations which is performed on personal data or on sets of personal data, whether or not by automated means, such as collection, recording, organisation, structuring, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, restriction, erasure or destruction”. See Article 4(2) of the GDPR.
105 Although consent is at the heart of the GDPR, it is important to note that there are other equally valid legal bases for collecting personal information under the Regulation: legitimate interests, legal obligation, vital interests, public interest, and performance of a contract.
106 This means that consent cannot be implied or implemented based on an opt-out mechanism, cannot be bundled or tied to the provision of a service, and necessarily requires the determination of a specific, explicit and legitimate purpose for the intended processing activity.
107 Recital 32 of the GDPR.
108 Emphasis added.
109 See Guidelines 05/2020 on consent under Regulation 2016/679 (Version 1.1), adopted 4 May 2020, paragraph 3.
110 Ibid. paragraph 39.
111 Ibid. paragraph 13.
112 Ibid.
113 Ibid. paragraph 43.
114 Ibid. Paragraph. See also Recital 66 of the “Directive 2009/136/EC of the European Parliament and of the Council of 25 November 2009 amending Directive 2002/22/EC on universal service and users’ rights relating to electronic communications networks and services, Directive 2002/58/EC concerning the processing of personal data and the protection of privacy in the electronic communications sector and Regulation (EC) No 2006/2004 on cooperation between national authorities responsible for the enforcement of consumer protection laws” (which requires that, when storing cookies on users’ devices (which effectively grants access to information contained on such devices), it is “of paramount importance that users be provided with clear and comprehensive information… [and] [t]he methods of providing information and offering the right to refuse should be as user-friendly as possible”).
115 Ibid. (n 109) paragraph 75. Emphasis added.
116 Ibid. paragraph 84. Emphasis added.
117 Ibid. (n 109) paragraph 75.
118 See also Recital 108 and Article 25 (on Data protection by design and by default) of the GDPR.
119 Article 25 (2) GDPR.
120 See European Data Protection Board, Guidelines 4/2019 on Article 25 Data Protection by Design and by Default, Version 2.0 (Adopted on 20 October 2020) 18. Emphasis added.
121 At least one data protection authority, the French Data Protection Agency (Commission Nationale de l’Informatique et des Libertés – CNIL), has imposed a €50 million penalty on Google for lack of transparency, unsatisfactory information, and lack of valid consent for advertising personalisation.
122 Ibid.
123 Ibid. (n 120).
124 Ibid. (n 64).
125 Ibid.
126 Ibid.
127 Giovanni Buttarelli, “Legal Design Roundtable”, European Data Protection Supervisor (EDPS 27 April 2019) accessed 11 March 2022.
128 Ibid.
129 Ibid.
130 Of course, nothing stops individual Member States in the EU from enacting legislation on privacy dark patterns. The GDPR confirms that “Member States [are] allowed to maintain or introduce national provisions to further specify the application of the rules of this Regulation. [The GDPR] also provides a margin of manoeuvre for Member States to specify its rules, including for the processing of special categories of personal data (“sensitive data”). To that extent, this Regulation does not exclude Member State law that sets out the circumstances for specific processing situations, including determining more precisely the conditions under which the processing of personal data is lawful.” See Recital 10 of the GDPR.
131 European Data Protection Board, Guidelines 3/2022 on dark patterns in social media platform interfaces: how to recognise and avoid them (version 1.0) adopted on 14 March 2022.
133 Ibid. (n 132) 2.
134 Ibid. (n 107).
135 Andy Green, “Complete Guide to Privacy Laws in the US” (Varonis 2 April 2021) accessed 9 March 2022.
136 California Privacy Rights Act of 2020, codified at Cal Civ Code § 1798.140 (Deering 2021) [Operative Jan. 1, 2023] accessed 5 March 2022.
137 The CPRA also contains new provisions, including the establishment of the California Privacy Protection Agency, vested with “full administrative power, authority, and jurisdiction to implement and enforce” the CCPA. Enforcement of the CPRA will not begin until July 1, 2023, and enforcement will apply only to violations occurring on or after that date.
138 Section 1798.140, subdivision (h), CPRA.
139 Section 9, subdivision (l), CPRA.
140 Section 1798.185, subdivision (a), paragraph (20)(C), CPRA.
141 Section 1798.135, subdivision (b), paragraph (2), CPRA.
142 Ibid.
143 Ibid.
144 A Bill for an Act Concerning Additional Protection of Data Relating to Personal Privacy, Senate Bill 21-190, State of Colorado.
145 Given that consent is not required under the CCPA (which the CPRA amends), except under three very specific circumstances, the “prohibition on use of dark patterns in obtaining consent is arguably more significant in the CPA because the CPA requires consent for collection of sensitive data, which includes several specific categories of data as well as personal data from a known child. As a result, any controller that collects sensitive data in Colorado will be required to comply with this provision and any additional regulations regarding dark patterns.” (See David Stauss and Stacey Weber, “How do the CPRA, CPA, and VCDPA treat dark patterns?” (Bytebacklaw 16 March 2022) accessed 16 March 2022.) It should be noted, however, that while the CCPA does not specifically require a business to obtain an opt-in before collecting or sharing data, it does require that companies provide consumers with the ability to opt out.
146 Ibid. (n 145) Section 5
147 Ibid. Section 9
148 Ibid. Section 5
149 Jennifer King & Adriana Stephan, “Regulating Privacy Dark Patterns in Practice – Drawing Inspiration from California Privacy Rights Act” (2021) 5 Geo. L. Tech. Rev. 251, 272. (Citing Arunesh Mathur, Mihir Kshirsagar, and Jonathan Mayer, “What Makes a Dark Pattern…Dark?: Design Attributes, Normative Considerations, and Measurement Methods” (2021) ACM 13 at 17.)
150 Section 999.315(h), “California Consumer Privacy Act Regulations”, Chapter 20, accessed 29 March 2022.
151 Section 999.315(h)(1), California Consumer Privacy Act Regulations.
152 This privacy dark pattern describes a situation where digital platform providers make it easy for users to give away their information and extremely complicated for them to limit information sharing.
153 Section 999.315(h)(2) California Consumer Privacy Act Regulations.
154 This privacy dark pattern entails misleading users into picking the wrong option through, for example, the use of double negatives.
155 Section 999.315(h)(5) California Consumer Privacy Act Regulations.
156 This privacy dark pattern entails creating a deliberately long and tedious process for reaching the desired settings, or making the settings so fine-grained and complicated that the user is encouraged to give up before reaching their initial target. See Ibid. (n 14).
157 Experts predict that “[g]iven California’s size and stature, [the CCPA and the CPRA] will likely have both a catalyzing effect on state and federal responses in the near-term and privacy dark patterns regulations will proliferate.” See Jennifer King and Adriana Stephan, “Regulating Privacy Dark Patterns in Practice – Drawing Inspiration from California Privacy Rights Act” (2021) 5 Geo. L. Tech. Rev. 250, 253. See also Bradley Song, “Exploring “Dark Patterns” and the California Privacy Rights Act of 2020” (Gordinier Kang & Kim LLP 14 October 2021) accessed 6 March 2022.
158 The Filter Bubble Transparency Act would require certain platforms to notify users if their personal data is used to select the content they see using an “opaque algorithm.”
159 S.4626 – “SAFE DATA Act”, 116th Congress (2019-2020) accessed 18 March 2022.
160 MĂĽge Fazlioglu, “Consolidating US privacy legislation: The SAFE DATA Act” (IAPP 21 September 2020) accessed 11 March 2022.
162 Section 102(a)(2), SAFE DATA Act.
163 Section 104(e), SAFE DATA Act.
164 Section 107(a)(1)(B), SAFE DATA Act.
165 Section 107(a)(1)(C)(ii), SAFE DATA Act.
166 As in most parts of the world, Canada also treats privacy as a fundamental human right. Although privacy is not explicitly protected in the Canadian Constitution, Section 2(b) (on freedom of expression), Section 7 (on the right to life, liberty, and security of the person), and Section 8 (on the right to be secure against unreasonable search or seizure) of the Charter of Rights and Freedoms – which was made part of the Canadian Constitution in 1982 – have been interpreted as protecting the right to privacy. That said, due partly to Canada’s federal system, a complex framework of statutes, common law, and torts currently exists to protect different forms of privacy and cover different entities, including government institutions, private companies, and individuals.
167 An Act to extend the present laws of Canada that protect the privacy of individuals and that provide individuals with a right of access to personal information about themselves, R.S.C., 1985, c. P-21
168 “Government institution”, according to Section 3 of the Privacy Act, means “(a) any department or ministry of state of the Government of Canada, or any body or office, listed in the schedule, and (b) any parent Crown corporation, and any wholly-owned subsidiary of such a corporation, within the meaning of section 83 of the Financial Administration Act”.
169 Kris Klein, Canadian Privacy: Data Protection Law and Policy for the Practitioner, 4th edition (International Association of Privacy Professionals 2020).
170 Personal Information Protection and Electronic Documents Act, SC 2000, c 5.
171 PIPEDA also applies to federally regulated organizations, regardless of whether the provinces they are situated in have substantially similar privacy legislation. PIPEDA may even apply to non-profit organizations, political organizations, etc., if they engage in commercial activities that are not central to their mandate and if those commercial activities involve dealing with personal information. For clarity, PIPEDA does not apply to: personal information handled by federal government organizations governed under the federal Privacy Act; provincial or territorial governments, crown corporations, and other government organizations subject to provincial Freedom of Information and Protection of Privacy Acts; private organizations in provinces with substantially similar laws; and organizations subject to substantially similar health laws. PIPEDA also does not apply to an individual’s collection, use or disclosure of personal information solely for personal purposes; an organization’s collection, use or disclosure of personal information solely for journalistic, artistic or literary purposes; or business contact information that is collected, used or disclosed solely for the purpose of communicating with a person in relation to their employment or profession. It is important to note, however, that sometimes an individual’s personal information can be so closely linked to information about their company that the information about the company can be considered the individual’s personal information.
172 See Alberta’s Personal Information Protection Act (S.A. 2003, c P-6.5), British Columbia’s Personal Information Protection Act (R.S.B.C. 2003, c. 63), and Québec’s Act respecting personal information in the private sector (CQLR c P-39.1), respectively.
173 One other law that merits mention here is “An Act to promote the efficiency and adaptability of the Canadian economy by regulating certain activities that discourage reliance on electronic means of carrying out commercial activities, and to amend the Canadian Radio-television and Telecommunications Commission Act, the Competition Act, the Personal Information Protection and Electronic Documents Act and the Telecommunications Act”, SC 2010, c 23 [CASL], or Canada’s Anti-Spam Legislation for short. One of the toughest of its kind, this federal-level legislation introduced a number of changes to PIPEDA and was enacted to govern how commercial electronic messages are sent from Canada, to Canada, and on behalf of someone in Canada. It should also be noted that there are certain sector-specific privacy laws, including the Bank Act, which governs how federally regulated financial institutions collect, use, and disclose personal information; provincial credit union laws, which usually mandate that members’ transactions be kept confidential; and provincial laws regulating credit reporting, which often impose privacy obligations on credit reporting agencies. The existence of these other privacy-related laws does not automatically preclude the application of PIPEDA.
174 PIPEDA is built on a unique version of fair information principles developed and published by the Canadian Standards Association in 1996. Tagged the “Model Code for the Protection of Personal Information”, the fair information principles were distilled from the privacy principles originally published in 1981 by the Organisation for Economic Co-operation and Development (“OECD”) and comprise accountability, identifying purposes, consent, limiting collection, limiting use, disclosure and retention, accuracy, safeguards, openness, individual access, and challenging compliance. The fair information principles are so important that they must be contained in any provincial law before it can be considered “substantially similar” to PIPEDA. To be sure, there are other requirements: the provincial laws must also afford similar privacy protection; allow the holding, use, and transfer of personal information only for legitimate purposes; and provide for redress, independent oversight, and powers of investigation as in PIPEDA.
175 Michael Geist, “Canada’s GDPR Moment: Why the Consumer Privacy Protection Act is Canada’s Biggest Privacy Overhaul in Decades” (MichaelGeist 17 November 2020) accessed 19 March 2022.
176 Professor Ignacio Cofone, “Policy Proposals for PIPEDA Reform to Address Artificial Intelligence Report” (Office of the Privacy Commissioner of Canada November 2020) accessed 22 March 2022.
177 Office of the Privacy Commissioner of Canada, “Submission of the Office of the Privacy Commissioner of Canada on Bill C-11, the Digital Charter Implementation Act, 2020” (Office of the Privacy Commissioner of Canada May 2021) accessed 27 March 2022.
178 There are many issues with heavy reliance on the consent basis, chief of which is that provisions on consent are based on the rational choice theory of privacy (which assumes that individuals rely on rational calculations to make rational choices that result in outcomes aligned with their best interests), thereby ignoring the systematic effects of the subjective factors that significantly determine actual privacy choices and failing to consider the privacy paradox, i.e., the curious contradiction between how a person intends to protect their online privacy and how they actually behave online, or, as one author puts it, “[t]he paradox of wanting privacy but behaving as if it didn’t matter” (see Helena Vieira, “The paradox of wanting privacy but behaving as if it didn’t matter” (LSE 19 April 2018) accessed 7 March 2022). See also generally Richard Warner, “Notice and Choice Must Go: The Collective Control Alternative” (2020) 23(2) Science and Technology Law Review 173. On the issues with PIPEDA’s heavy reliance on the consent principle, see generally Lisa M. Austin, “Is Consent the Foundation of Fair Information Practices? Canada’s Experience under PIPEDA” (2006) 56 University of Toronto Law Journal 181.
179 See Royal Bank of Canada v. Trang [2016] 2 SCR 412
180 Carlos Jensen and Colin Potts, “Privacy Policies as Decision-Making Tools: An Evaluation of Online Privacy Notices” (2004) 6 CHI 471, 477.
181 See, for example, Professor Lisa Austin, “Who decides? Consent, meaningful choices, and accountability” (Schwartz Reisman Institute for Technology and Society 22 December 2020) accessed 29 March 2022.
182 For reference, see paragraphs 4.2.3, 4.2.5, 4.3.6, 4.5.2, 4.5.3, 4.6.3, 4.7.2, 4.9, and 4.10.2, Schedule 1, PIPEDA.
183 PIPEDA does require regulated entities to identify the purposes for which they are collecting personal information (see generally Paragraph 4); they just do not bear the obligation to specify those purposes to individuals at the time of collecting personal information from them, per the joint reading of Section 5(2) and paragraph 4.2.3 of PIPEDA.
184 The concept was developed in Canada in the 1990s by Ann Cavoukian, then Information and Privacy Commissioner of Ontario.
185 David Krebs, “Canada: Implementing Privacy By Design” (Mondaq 12 November 2018) accessed 17 March 2022.
186 House of Commons, Canada, Report of the Standing Committee on Access to Information, Privacy and Ethics, “Towards Privacy By Design: Review of the Personal Information Protection and Electronic Documents Act” (House of Commons 2018) 50.
187 Ibid. 51.
188 Recognizing some of the inherent deficiencies in the consent mechanism under PIPEDA, the Office of the Privacy Commissioner issued its “Guidelines for obtaining meaningful consent” in 2018 and revised them in 2021. See Office of the Privacy Commissioner, “Guidelines for obtaining meaningful consent” (May 2018, revised 13 August 2021) accessed 14 March 2022. While very detailed and useful, the Guidelines do not address privacy dark patterns explicitly and fail to cure inherent weaknesses in the notice-and-choice architecture of PIPEDA, some of which have been identified above.
189 Ibid. (n 178).
190 Section 16, Bill C-11, An Act to enact the Consumer Privacy Protection Act and the Personal Information and Data Protection Tribunal Act and to make consequential and related amendments to other Acts.
191 See Section 18, Bill C-11
192 Teresa Scassa, “Replacing Canada’s 20-Year-Old Data Protection Law” (23 December 2020) accessed 15 March 2022.
193 For example, Bill C-11 “places too much emphasis on providing organizations flexibility in defining the purposes for which personal information may be used and in obtaining consumer consent”; fails to incorporate privacy-by-design requirements; and fails to get rid of implied consent in its consent provisions, placing the onus on companies to establish when implied consent is “appropriate”, even though “[t]here is… nothing in [Section 62]’s transparency requirements that requires specific reporting on the use of implied consent, so it is entirely unclear how reliance upon implied consent will even be detected, let alone enforced”. (See Submission of the Office of the Privacy Commissioner of Canada on Bill C-11, the Digital Charter Implementation Act, 2020.)
194 See paragraphs 4.3.5 and 4.4.2 of PIPEDA
195 Ibid. (n 188).
196 See Section 10(b) of Alberta’s Personal Information Protection Act (S.A. 2003, c P-6.5), which states that “[i]f an organization obtains or attempts to obtain consent to the collection, use or disclosure of personal information by… using deceptive or misleading practices, any consent provided or obtained under those circumstances is negated”. [Emphasis added.] See also Section 7(3)(b) of British Columbia’s Personal Information Protection Act (R.S.B.C. 2003, c. 63), which uses almost exactly the same words, as well as Section 14 of Québec’s Act respecting personal information in the private sector (CQLR c P-39.1), which provides that: “Consent to the collection, communication or use of personal information must be manifest, free, and enlightened… Consent given otherwise than in accordance with the first paragraph is without effect.” (1993, c. 17, s. 14; 2006, c. 22, s. 115.) It is interesting to note also that Québec recently updated many of its privacy laws, including its private-sector legislation cited above, through the omnibus Bill 64 (An Act to modernize legislative provisions as regards the protection of personal information), which was assented to on 22 September 2021. Section 110 of Bill 64 requires that consent “must be clear, free and informed and be given for specific purposes. It must be requested for each such purpose, in clear and simple language. If the request for consent is made in writing, it must be presented separately from any other information provided to the person concerned…”, thereby implicitly prohibiting privacy dark patterns.
197 Office of the Privacy Commissioner of Canada, “2020-21 Survey of Canadians on Privacy-Related Issues” accessed 22 March 2022.
198 Ibid.
199 Ibid. (n 178).
200 Evidently, this would entail insisting on the observance of the fair information principles – thus, among other things, information collection purposes should be clearly identified, explicitly specified, and strictly limited; information collection should be minimized to what is necessary or legitimate; and strong accountability requirements (including the maintenance of adequate records) should be prescribed.
201 Interestingly, the newly released EU Guidelines on Dark Patterns adopt this approach also.
202 Hunton Andrews Kurth, “Apple and Google Announce Effective Dates of New Mobile App Privacy Requirements” (NatLawReview 8 November 2021) accessed 21 March 2022.
203 Ibid.
204 Kif Leswing, “Apple’s ad privacy change impact shows the power it wields over other industries” (CNBC 13 November 2021) accessed 21 March 2022.
205 Daisuke Wakabayashi, “Google Plans Privacy Changes, but Promises to Not Be Disruptive” (New York Times 16 February 2022) accessed 21 March 2022.
206 Ibid.
207 PIPEDA, CASL, and provincial privacy statutes in Canada place consent at the heart of information protection. To illustrate, under PIPEDA, consent is generally required for the collection of personal information and for the subsequent use or disclosure of that information. Under CASL, companies must not send commercial electronic messages unless “the person to whom the message is sent has consented to receiving it, whether the consent is express or implied…” (CASL, above note 13, section 6(1)(a)). Furthermore, unless they are acting under a court order, companies must obtain, in compliance with CASL, the “express consent of the owner or an authorized user of [a] computer system” before they “install or cause to be installed a computer program on any other person’s computer system…”. See CASL, above note 13, sections 8(1)(a) and 8(1). Meanwhile, consent is also central to information collection and use under provincial privacy laws. See, for example, Section 7, Personal Information Protection Act, Statutes of Alberta, 2003, Chapter P-6.5, and Section 29 and Part 3, Ontario Personal Health Information Protection Act, 2004, S.O. 2004, c. 3, Sched. A.