The privacy paradox

  • April 19, 2022
  • Caroline Ross

Introduction

In the online era of commercial activities, a conflict exists between maximizing the utility of e-commerce and networking websites, and protecting the privacy of internet users.1 This tension is seen through the endless choices that consumers make between their privacy and experiencing efficient personalization, usability, and interactivity as they navigate the internet.2 In this way, privacy can be conceptualized as a double-edged sword, because while individuals use privacy controls to protect their personal information, privacy can also become a barrier to innovations within the technology industry.3 Due to this inherent conflict between these interests, a phenomenon known as the “privacy paradox” was born.

The emergence of this paradox is due, in part, to the discrepancy between consumer expectations and the reality of how privacy policies fail to protect their information.4 While consumers are tasked with making trade-offs between the potential gains and losses in sharing their information,5 their individual choices to disclose information can lead to profiles being built through data aggregation. This creates a paradox: while consumers indicate concerns regarding their online privacy, their overall management behaviours do not reflect these fears. Therefore, this fundamental disconnect must be confronted to build a legal framework surrounding privacy in the private sector that mirrors the way in which consumers want their information to be protected.

I argue that to confront the privacy paradox, a normative shift needs to occur in the way that privacy is operationalized in the online realm of the private sector. Therefore, Part I of this paper will contextualize the privacy paradox by exploring two classical theories of privacy, and how these theories feed into the current legal framework of private sector privacy in Canada. Part II will examine the notice-consent model by highlighting its inherent cognitive and structural issues through an assessment of the impact of data mining on consumers. Part III will dive into contemporary theories of privacy while assessing the privacy paradox. This section will also provide a brief overview into the EU’s approach to consumer privacy and the Consumer Privacy Protection Act that the Canadian Federal Government recently proposed. Finally, this paper will attempt to answer the question of how to form a new legal framework that will tackle the problem of the privacy paradox.

Part I: Setting the stage

Classical theories of privacy

Before discussing the legal framework of privacy in the private sector, the question of “what is privacy?” must be examined. The answer to this question can be characterized by using either descriptive or normative measures. A descriptive lens conveys only the degree of privacy that people actually enjoy, while a normative notion of privacy investigates why that degree of privacy should be protected.6

The theories of non-intrusion and seclusion conceptualize privacy using a descriptive framework.7 The theory of non-intrusion, advanced by Warren and Brandeis,8 defines privacy as the right to be left alone. The theory of privacy as seclusion holds that privacy requires that no one has access to the individual.9 These two theories put forward an all-or-nothing version of privacy; those who disclose information or provide access to others are essentially giving up their privacy entirely.10 Thus, these theories answer the question of “what is privacy?” by asserting that privacy is achieved through a state of total retreat.

While retaining aspects of the non-intrusion and seclusion theories, the control model is marked by a shift toward assessing privacy through a normative approach. Under the control theory, as described by Rachels,11 an individual has privacy if and only if they have control over their personal information. Consent, whether obtained explicitly or implicitly,12 is the driving force behind this theory. Instead of conceptualizing privacy as a personal bubble, this model characterizes it primarily through choices regarding the flow of information.13 Autonomy, which Gerald Dworkin14 defines as “the ability of an individual to think independently to implement their own choice[s]”, allows individuals to exercise their own agency by consenting to the disclosure of personal information.15 Therefore, this model answers the question of “what is privacy?” by stating that the level of privacy an individual is afforded is connected to their ability to control who has access to their information, and that this degree of privacy should be protected because of the element of autonomy.

The next section of this paper will explore how these theories are operationalized within Canada’s jurisprudence of private sector privacy by assessing how the courts interpret the guidelines set out by the legislation governing this area of law.

Current legal framework of privacy in the private sector

Enacted in 2000, the Personal Information Protection and Electronic Documents Act (“PIPEDA”) applies to private-sector organizations across Canada that collect, use, or disclose personal information in the course of commercial activities.16 The intent of this seminal piece of legislation is to promote and balance both privacy and legitimate business interests.17 The 10 Fair Information Principles,18 which make up the backbone of PIPEDA, were inspired by the 1980 Organisation for Economic Co-operation and Development (OECD) Privacy Guidelines19 and the 1996 Canadian Standards Association (CSA) Model Code for the Protection of Personal Information.20 For the purposes of this paper, the main principle that will be discussed is the normative issue of consent.

The consent principle maintains that organizations are generally required to notify individuals and obtain consent for the collection, use, or disclosure of personal information.21 Since the enforcement of PIPEDA, the Office of the Privacy Commissioner of Canada (“OPC”) has asserted that the concept of “meaningful consent” is an essential element of this principle.22 The Act was subsequently amended to clarify that consent is only valid if it is reasonable to expect that an individual would understand the nature, purpose, and consequences of the collection, use, or disclosure of the personal information to which they are consenting.23 In determining what constitutes “reasonable expectations,” the Supreme Court of Canada (Supreme Court) in R v Tessling proposed a multifactorial test to determine whether an individual’s subjective expectation of privacy is objectively reasonable.24 In line with this reasoning, the Supreme Court in Royal Bank of Canada v Trang held that the whole context is important when determining the reasonable expectations of the individual, and, as such, the relevancy of the disclosure to the organization receiving the information should be assessed.25 An individual’s reasonable expectations, therefore, depend on the totality of the circumstances.

The definition of “personal information” should be unpacked, as this is a vital aspect of the privacy framework. At first glance, it seems as if there is a conflict within Canadian jurisprudence over what constitutes an individual’s personal information. While PIPEDA broadly defines personal information as any “information about an identifiable individual,”26 the Courts seem to take a narrower approach as to what type of information should be protected.

Canadian Courts view privacy as a quasi-constitutional right since some elements of privacy are entrenched in the Charter of Rights and Freedoms.27 Furthermore, the Supreme Court in Tessling28 held that data related to an individual’s “biographical core,” i.e., information that is highly revealing, sensitive, or meaningful to an individual’s personal life,29 should be safeguarded. This Court also recognized three broad categories of privacy that warranted protection: bodily, territorial, and informational privacy. Since this paper focuses on informational privacy, the standard the Court sets out will be examined. Out of the three categories, informational privacy is generally afforded a weaker standard,30 because this type of data deals with mundane rather than highly sensitive information. The Supreme Court in Trang echoed this sentiment, holding that implied consent could fulfill the consent requirement for less sensitive information.31 In the context of informational privacy, it is too narrow an approach to restrict personal information to only that which falls under the biographical core. As this paper will explore, mass amounts of mundane data are collected online and then pieced together to profile users. By using this high threshold, Canadian Courts misinterpret how information is collected, used, and disclosed in the online context because this approach fails to consider the effects of the accumulation and combination of initially small data points.

Examining the way in which the definition of personal information interacts with consent is a vital aspect of constructing a private sector privacy framework. This is because the meaningful consent guideline is the underlying principle that governs the behaviours that consumers undertake to protect their privacy. For the reasons stated above, I have elected to use the definition stated in PIPEDA, i.e., information that may fall outside of the biographical core, when discussing an individual’s personal information throughout the rest of this paper.

Part II: The misalignment between current legal instruments and the reality of online privacy

Notice-consent model and the privacy paradox

The knowledge and consent aspect of PIPEDA’s consent principle is seen through the notice-consent model of online privacy policies. In this model, the notice requirement ensures that internet users are aware of a firm’s data practices, while the consent requirement ensures that those users are only subject to those practices if they choose to be.32 This model is attractive because it allows for a potential shift in power from the businesses that want to collect information to the individuals who are granted the ability to determine who gets access to their information. This is especially important in the online realm since it empowers consumers’ choices as they sift through the myriad of benefits and risks associated with disclosing certain facets of their personal information.33

However, a pertinent issue arises when firms shirk their duties of maintaining fair information practices by performing the bare minimum of adequately notifying individuals.34 In this scenario, consumers bear the heavy burden of deciding exactly how to navigate online spaces while protecting their individual privacy. The effectiveness of the notice-consent model relies on the assumption that consumers will read and understand these privacy policies to fulfil the objective of making rational decisions.35 The control model thus categorizes and assesses all consumers under the assumption that they will act reasonably and rationally, just as legal norms usually group many individual cases under various homogenous umbrellas.36 Yet many consumers who are assumed to be rational cannot act rationally given the nuances of being an online consumer. The issue with the control model at this point in time is that the moniker of the “reasonable consumer” does not reflect the reality of the online world, and this disconnect produces the privacy paradox.

The privacy paradox describes the inconsistency between people’s stated concerns about privacy and their demonstrated or intended disclosure of personal information. While consumers may have the intention to perform privacy management tasks due to their individual attitudes towards online privacy, many consumers do not behave in a way that is conducive to their goals.37 This paradox is driven by both cognitive and structural difficulties associated with the notice-consent model’s limitations in allowing consumers to properly manage their privacy.

Cognitive issues

The main concern with the notice-consent model is that it does not offer any real choice to consumers. Privacy policies are generally offered to consumers in an all-or-nothing situation, where the social cost of opting out and not using the platform or website is too high to meaningfully represent a choice.38 Even if consumers are offered meaningful choices, there are a few cognitive barriers that can limit their ability to properly engage in privacy management. The crux of the dilemma is that privacy policies are often presented in a way that is too difficult for consumers to understand well enough to correctly perform a cost-benefit analysis. In assessing the flow of information, the purpose of the exchange, as well as the benefits, risks, and harms associated with sharing the information, must be considered.39 The privacy paradox can result from the cognitive issues of (1) a skewed calculus towards the benefits of disclosure, (2) an optimistic bias, or (3) learned helplessness and a lack of consumer privacy empowerment.40

Skewed calculus

In determining whether to disclose information over the internet, consumers may look towards the value of gaining social capital.41 Social capital refers to the positive product of human interaction, as it allows a society or group to function together through mutual relationships.42 Social capital can be gained through “bridging,” i.e., gaining connections with weak ties, or “bonding,” i.e., connections with strong ties.43 This concept is especially important in the online context of social networking sites, as users essentially trade in or sell off their personal information to create and maintain relationships.44 The use of Facebook or similar sites has become integrated into our daily lives45 because these sites allow people to form bonds with others based on personal interests or shared experiences. The use of these sites can satisfy several needs, such as diversion or entertainment, building social relationships, and identity construction.46 This idea was emphasized in Douez v Facebook, as the Court recognized that accessing social media platforms, and the online communities of which they are composed, has become increasingly important to our everyday lives.47 The choice of staying offline by not accepting the terms and conditions of these platforms is not a real choice that people have in this era. Especially within the context of mandated stay-at-home orders,48 these social networking sites have become akin to a lifeline as they allow people to stay connected with their family and friends in a virtual setting. Due to the importance of these sites, internet users may overemphasize the social capital gains associated with online community building and are thus unwilling to forfeit these benefits to maintain a higher level of privacy. A skewed cost-benefit analysis can occur because the actual or perceived benefits of social network participation may outweigh any potential risks of disclosing personal information online.
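To make the skewed calculus concrete, the toy sketch below models disclosure as a simple weighted comparison of perceived benefits and risks. The function, values, and weights are hypothetical illustrations rather than figures drawn from the studies cited above; the point is only that inflating the weight placed on social-capital benefits flips the same decision from non-disclosure to disclosure even though the underlying risk is unchanged.

```python
# Toy sketch of a "skewed calculus" (all values and weights are hypothetical):
# the decision to disclose is modelled as weighted perceived benefits versus
# weighted perceived risks.

def decides_to_disclose(benefit, risk, benefit_weight=1.0, risk_weight=1.0):
    """Disclose only when the weighted benefits exceed the weighted risks."""
    return benefit * benefit_weight > risk * risk_weight

perceived_benefit = 4   # arbitrary units: social capital gained by posting
perceived_risk = 6      # arbitrary units: potential harm from the disclosure

# A balanced calculus declines to disclose...
print(decides_to_disclose(perceived_benefit, perceived_risk))                      # False
# ...but overvaluing social capital flips the same decision to "disclose".
print(decides_to_disclose(perceived_benefit, perceived_risk, benefit_weight=2.0))  # True
```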

Optimistic bias

The optimistic bias occurs when consumers have an unrealistic view of the potential privacy risks they could face while navigating the internet.49 While acknowledging that risks can occur, some consumers believe they are at a lower risk of experiencing a negative event (e.g., a privacy breach) compared to others. This bias may stem from overconfidence in their skills and knowledge regarding privacy management, or from incorrect assumptions about what information is protected under privacy policies and to what degree.50 With respect to these incorrect assumptions, one study looked at various privacy policies to assess user perceptions and found that several scenarios that were covered under these policies did not meet consumer privacy expectations.51 Notifying consumers about the privacy policies they interact with is a vital aspect of the notice-consent model, but these notices can do more harm than good if they play a role in creating a disconnect between user perceptions and reality.52 Moreover, corporations may elect to design their policies in a way that furthers this disconnect by intentionally obscuring which third-party organizations receive information from that company.53 This can undermine the privacy policy’s role as a contract dictating the terms that consumers must agree to. The optimistic bias is thus tied to the privacy paradox because it highlights the connection between consumers making a minimal effort to engage in privacy management strategies and the differences between privacy perceptions, expectations, and reality.

Learned helplessness

While some consumers may be too optimistic with regard to their chances of encountering privacy risks, other consumers feel a lack of empowerment in being able to make meaningful choices about their online privacy. These consumers believe that while the collection, use, and sharing of personal information online is a severe problem to which they are susceptible, they have little confidence in their overall ability to protect their information.54 Empowerment involves the processes and outcomes related to control, critical awareness, and participation. Learned helplessness, on the other hand, occurs when an individual believes that they are powerless to prevent negative outcomes or to obtain desired outcomes.55

An example of how learned helplessness can stem from a perceived lack of real choice56 is the updated advertisement policy Samsung released in Canada in late 2019.57 Samsung’s Smart TV Interest-Based Advertisement program analyzes and uses data that it has received over time to provide customers with tailored and interactive advertisements offered by Samsung and its third-party affiliates. From an e-commerce perspective, the ability to collect user data is extremely valuable, and companies often suggest that these benefits are passed on to customers in the form of convenience through targeted advertising.58 Convenience for consumers is a function of time and effort, which means that services are convenient if offered with effective time utilization, portability, and minimal user input. Consequently, Samsung justifies the use of its advertisement system because it allows users to efficiently browse through entertainment applications by simply clicking on the recommended content rather than through aimless searching.

Nevertheless, from a consumer standpoint, these advertisements may exacerbate the feeling of being manipulated by corporations.59 The terms and conditions that must be accepted to use the Samsung Smart TV state that the service may be supported by advertisements at any time. This means that while consumers have the option of opting out of the interest-based advertisements, this choice may not prevent the delivery of generic advertisements to their Smart TV. The policy also stipulates that these advertisements will be broadcast across various Samsung and third-party platforms, as well as other advertising-capable devices associated with the Samsung Smart TV.60 Even if consumers want to see interest-based advertisements on their television but not on their associated devices, they must partake in an all-or-nothing decision-making process.

The prospect of receiving tailored information was supposed to strengthen a consumer’s ability to make individual choices,61 but it seems the opposite is true. Consumers may be willing to tolerate some contextually targeted advertisements more than non-targeted advertisements because they have the potential to provide relevant information to the user.62 However, making these advertisements obtrusive in nature or streaming them across multiple devices may increase perceptions of manipulation. When consumers perceive that they are powerless to prevent privacy threats because they feel manipulated, they will display passivity in protecting their privacy and thus divulge information despite their worries.63 With respect to interest-based advertisement programs, some consumers may not change their privacy settings, even if they value security, because they perceive that their choices will not make a substantial difference. The concept of learned helplessness is associated with the privacy paradox as it helps illustrate why some consumers may appear apathetic about their online privacy, even though they indicate the intention to take part in privacy management techniques.

Structural issues

The cognitive issues present in the cost-benefit analysis of disclosure in the notice-consent model are linked to a few structural obstacles inherent to privacy management in the era of “big data.”

Problem of scaling

The first structural issue associated with the privacy paradox is the fact that the notice-consent model does not scale properly.64 There is a risk of consumers developing decision fatigue because internet users visit hundreds of websites daily.65 Consumers are given the impossible task of sifting through the privacy notice for every website they encounter to make informed and meaningful decisions. The multitude of disclosures generates an information overload, which overwhelms consumers and undermines the consent mechanism embedded in the notice-consent model.66 Due to this attention overload, consumers have the tendency to skim over the privacy notices or to ignore them altogether.67 This overload compounds the effect of learned helplessness and perceptions of manipulation as consumers feel there is no real choice in how their information is collected over the internet.68 This can be seen by assessing cookie banners,69 as some websites have an obstructive banner that allows users to manage their cookie settings, while other platforms simply state that a user agrees to the privacy policy by continuing to use the site. This setup highlights the illusion of choice granted to consumers by the notice-consent framework.
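A rough, purely illustrative calculation shows why reading every notice does not scale. The visit count, policy length, and reading speed below are assumptions chosen for demonstration, not figures from the sources cited in this paper.

```python
# Back-of-envelope estimate of the time cost of actually reading privacy notices.
# Every figure here is an illustrative assumption, not an empirical finding.

SITES_PER_DAY = 100        # assumed distinct sites or services encountered daily
WORDS_PER_POLICY = 2_500   # assumed average privacy policy length, in words
READING_SPEED_WPM = 250    # assumed adult reading speed, words per minute

minutes_per_policy = WORDS_PER_POLICY / READING_SPEED_WPM
hours_per_year = SITES_PER_DAY * minutes_per_policy * 365 / 60

print(f"Reading one policy: ~{minutes_per_policy:.0f} minutes")
print(f"Reading every policy encountered: ~{hours_per_year:,.0f} hours per year")
```

Even under generous assumptions, the time required dwarfs what any consumer could plausibly spend, which is why skimming or ignoring notices is the predictable outcome.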

Problem of data aggregation

Another key structural issue is the concept of aggregation, which is a vital aspect of utilizing “big data.” Aggregation, however, may limit a consumer’s ability to accurately weigh the costs and benefits of consenting to the disclosure of personal information.70 Internet users provide innocuous or mundane pieces of information to various sources, but consumers are generally not aware of when this data is aggregated and whether the aggregated data reveals sensitive information.71 Data mining tools are used to discover patterns in data that may appear nonobvious at first but can lead individuals to become identified or associated with arbitrary groups.72 Thus, opting in or out is becoming irrelevant to whether personal data can be accumulated: while a user may choose which sites access what information, this information can still be aggregated by third parties to create a complete profile that becomes available to numerous corporations.73 Additionally, the notice aspect of the model ultimately fails to capture the full effect of aggregation because companies may find it impossible to know in advance what they may discover through data mining techniques.74
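The sketch below illustrates the aggregation problem with fabricated example records: each disclosure is mundane in isolation, but joining the records on a shared identifier, much as an aggregator might, yields a profile that is far more revealing than any single data point. The sources, identifier, and attributes are hypothetical.

```python
# Illustrative sketch of data aggregation (all records are fabricated examples):
# mundane disclosures made to separate services are merged on a shared identifier.

from collections import defaultdict

disclosures = [
    # (source, identifier, attribute, value) -- each item looks harmless on its own
    ("retailer",    "user@example.com", "postal_prefix", "K7L"),
    ("fitness_app", "user@example.com", "jogging_time",  "06:00 daily"),
    ("pharmacy",    "user@example.com", "purchase",      "prenatal vitamins"),
    ("streaming",   "user@example.com", "genre",         "parenting documentaries"),
]

profiles = defaultdict(dict)
for source, identifier, attribute, value in disclosures:
    # Aggregation step: combine attributes from every source under one identity.
    profiles[identifier][attribute] = value

# The combined record suggests sensitive facts that no single disclosure revealed.
print(profiles["user@example.com"])
```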

This issue of aggregation is prevalent in the use of algorithms throughout social media networking platforms. In the era of “big data,”75 companies use algorithms to create user profiles and then use these profiles to silently seduce individuals into participating in ways that are advantageous to that corporation or its third-party affiliates.76 These algorithms force consumers into closed filter bubbles: consumers’ interests are captured, and the algorithm then chooses what content they receive, which in turn reinforces their initial opinions.77 The filter bubbles become a digital prison, as the algorithm keeps individuals away from alternative products or viewpoints. The concept of filter bubbles is evident in the Cambridge Analytica scandal, in which the company was able to access millions of Facebook profiles, without the users’ explicit consent, for the use of psychographic modelling for political purposes.78 In its investigation, the OPC found that Facebook did not demonstrate that it had obtained meaningful consent from the users who installed the third-party applications for the disclosure of their personal information, nor did it make a reasonable effort to ensure users had sufficient knowledge to provide meaningful consent for disclosures to third-party applications more generally.79 Corporations may try to convince policy makers that consumers want convenience over anything else, and that the use of algorithms to target specific users is therefore in consumers’ best interest.80 While consumers may accumulate short-term benefits through easy access to curated content, ultimately these techniques are used to persuade consumers to act in the corporation’s best interest. Therefore, when assessing the effects of using algorithms and aggregating data, it becomes apparent that privacy harms are cumulative in nature.
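A minimal sketch of how an engagement-driven recommender can produce a filter bubble, assuming a naive rule that simply boosts whatever the user clicks. The catalogue, weights, and update rule are hypothetical and are not drawn from any platform's actual algorithm.

```python
# Minimal filter-bubble sketch (hypothetical recommender, not any platform's actual
# algorithm): content the user engages with is weighted more heavily next time,
# so recommendations progressively narrow around the initial interest.

import random

catalogue = {"politics_a": 1.0, "politics_b": 1.0, "cooking": 1.0, "sports": 1.0}
user_clicks_on = "politics_a"  # assumed initial interest

def recommend(weights):
    """Pick one item, with probability proportional to its current weight."""
    items, w = zip(*weights.items())
    return random.choices(items, weights=w, k=1)[0]

for _ in range(200):
    shown = recommend(catalogue)
    if shown == user_clicks_on:   # user engages only with the familiar topic
        catalogue[shown] *= 1.2   # recommender reinforces whatever was clicked

total = sum(catalogue.values())
for item, weight in sorted(catalogue.items(), key=lambda kv: -kv[1]):
    print(f"{item:12s} {weight / total:6.1%} of future recommendations")
```

After a few hundred iterations, the clicked topic crowds out everything else, which is the narrowing dynamic the text describes.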

Putting it together: Psychological distance to the cumulative nature of privacy harms

Consumers often face a roadblock in assessing the benefits and harms associated with informational disclosure. This is due to the compounding effect of data aggregation and the problem of weighing short-term benefits more heavily than the potential harms of how their data might be used in the future. Privacy self-management techniques fail to account for the cumulative impacts of individual privacy decisions because these decisions are often made in isolated situations.81 The notice-consent model also failed consumers in the Cambridge Analytica situation because it allowed the choices made by other users to impact what information was gathered. Specifically, the OPC found that Facebook allowed third-party applications to obtain personal information about the friends of those who installed those applications.82 Even if the average consumer were able to perform a rational cost-benefit calculation without skewing the benefits or risks, privacy policies that allow these types of situations undermine the process of obtaining meaningful consent. It is unreasonable to expect that consumers will appreciate the potential effects of different forms of information gathering practices,83 especially with respect to the endless ways that third parties can gain access to their data or the future uses of the information they disclose.

Psychological distance measures how concrete or abstract an idea is to an individual. A low psychological distance indicates that an individual is more likely to have an increased level of perceived risk, while a greater distance highlights the benefits of the choice.84 With regard to informational privacy, consumers cannot reasonably protect their information, as the effects of data mining and the use of algorithms go beyond their privacy management skills.85 Additionally, future incidents are often discounted,86 which is why privacy harms may not seem immediate to consumers. Some consumers may have fewer personal experiences with privacy harms or are too optimistic about their perceived chance of risk, and thus diligently managing their privacy is not a main concern. The high level of psychological distance that consumers experience in relation to privacy indicates that the notice-consent model ultimately neglects to protect consumer privacy interests.

In short, the problem with the consent model is that it resembles a “choose-your-own-adventure” game in which, realistically, your choices do not matter. The notice-consent model essentially puts form above substance, as consent amounts to nothing more than an illusion of power granted to consumers. This illusion leads consumers to experience a psychological distance from privacy risks, which exacerbates the privacy paradox. By assessing the nuances of the privacy paradox, it becomes evident that the privacy paradox may not be a paradox after all. Consumers act according to the mistaken belief that their personal information is protected to a higher degree than the average privacy policy actually provides. The privacy paradox, therefore, does not illustrate a disconnect between consumer concerns and their management behaviours, but rather the discrepancy between the protection that consumers think they have and the protection they are actually afforded. The structural obstacles of scaling, data aggregation, and the various tactics of data mining through cookies and algorithms all have a hand in furthering this discrepancy.

Part III: Bridging the gap that created the privacy paradox

Contemporary theories

Due to the criticisms of the notice-consent model, the umbrella category of the social theory of privacy was formed. Theories under this umbrella are normative notions that connect privacy to the flow of information within certain contexts. Privacy as contextual integrity assesses whether the flow of information is appropriate by looking at the expectations within the specific communities which disclose and receive the information. Nissenbaum formed this theory by determining that privacy guarantees the boundaries that help maintain a variety of social environments.87 In this theory, protecting privacy is about the possibility for an individual to properly embed themselves in a multitude of social relationships.88 Building on the contextual integrity theory, privacy as a social contract indicates that there is a mutually beneficial agreement within a community about how information is shared and used.89 Individuals can differentiate between relationships by discriminately sharing information within their various social circles. Through this process, the social contract theory allows for the fact that individuals can disclose their personal information to some groups without relinquishing their overall privacy interests. The theories under this umbrella maintain that privacy is a relationship between the norms of particular situations or contexts and the ability of individuals to participate in numerous types of social relationships.

The current landscape of the social theory of privacy fails to capture the ways that social networking platforms erode the boundaries between types of social relationships. As the diverse set of social circles that comprise social networking sites grows,90 it becomes harder for users to manage their privacy in different contexts and relationships. Users of these sites are more concerned about how they manage the boundaries between their multiple social circles rather than maintaining the boundary between their “friends list” and the general public.91 While the social theory of privacy illuminates the nuances of why the notice-consent model is inadequate, it still fails to capture the reality of the online landscape. In future theoretical developments, it may be helpful to view the online landscape as simultaneously “one” and many contexts when gauging consumer privacy management techniques.

Recent and current developments

The European Union (“EU”) enacted the General Data Protection Regulation (“GDPR”)92 in 2018, which sets a high normative standard for information collection, use, and disclosure for any website or corporation that does business with the EU. The GDPR was formulated using the “privacy by default and design” policy, which mandates that, by default, only the personal data necessary for specific purposes is processed and that appropriate safeguards are put in place.93 This regulation also provides stricter rules on obtaining meaningful consent, as consent must be “freely given, specific, informed and unambiguous.”94 While this regulation still emphasises the notice-consent model, it alleviates some of the burden that consumers may face by ensuring that corporations make their data usage practices clear and concise.

Inspired by the GDPR, Canada’s Consumer Privacy Protection Act (“Bill C-11”)95 was proposed in November 2020 to create new data privacy obligations and new enforcement mechanisms. If it becomes law, this Act will address algorithmic transparency, alter the consent guidelines, and create new order-making powers for the OPC. To facilitate economic transactions, the bill would allow implied consent to be used for certain legitimate business purposes, but this exception would not apply when the personal information is collected or used to influence an individual’s behaviour or decisions. A concern with this system is that it essentially turns the notice-consent model into a notice-only model, and, as research has shown, the average consumer does not read or understand these privacy notices. Another concern is that, unlike the GDPR, there is currently no mention of an underlying privacy-by-design policy. An advantage of this proposed system is that individuals are granted a right of action for privacy harms, and the OPC will be able to order compliance or fine companies rather than only expressing either appreciation or disappointment towards a corporation’s measures to address privacy harms. Only time will tell whether Bill C-11 will represent a new hope for online consumer privacy, or whether it will amount to the same framework that created the privacy paradox repackaged under a different name.

Suggestions moving forward

It is apparent that the notice-consent model has reached its limit as algorithms become common in our way of life.96 The notice-consent model now seems more like a notice-waiver model, as it is primarily being used to shield corporations from liability when consumer information is used in ways consumers did not contemplate when initially consenting to the disclosure. A personalized privacy protection program would allow privacy regulation to balance a more paternalistic approach to protecting consumer privacy with ensuring that consumers do not feel as if their autonomy has been taken away.97 In doing so, privacy regulation could create a baseline level of privacy protection that allows different tiers of choices to be made with regard to how information is disclosed.98 The baseline level of privacy would have to ensure that consumer data is secure regardless of whether consumers undertake measures to further protect their privacy. This is akin to how food cannot be sold unless it passes the safety requirements set out by the Canadian Food Inspection Agency. Once food can be sold, consumers can choose between various tiers, such as whether to purchase organic or non-organic items. The nutritional contents of food are simply laid out on a label, which helps consumers make meaningful choices in their purchases. This analogy shows that designing a successfully enforced set of standards that does not strip away the autonomy of its users is not a new concept.99

When applied to online privacy, this type of system would enable some consumers to sell their personal information or exchange it for services, while still allowing others to protect their information by not having it collected against their will.100 A regulatory privacy program could use algorithms to indicate the quality of a personalized privacy protection plan.101 Instead of simply fighting fire with fire or trying to eliminate algorithms entirely, the privacy-by-design concepts of data minimization and transparency102 would allow for a streamlined process in which regulatory bodies can assess how personal information is captured and used across the internet. Coupled with the new enforcement mechanisms in Bill C-11, it may be feasible to leverage big data in a way that protects consumer privacy. Perhaps personalizing privacy protection could alleviate the paradoxical issues associated with the current “one size fits all” protection method, as sketched below.
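The sketch below illustrates the "baseline plus tiers" idea in code: a mandated protective default applies whenever the consumer makes no choice, and higher-disclosure tiers exist only as explicit opt-ins. The tier names, fields, and rules are hypothetical and are offered only to show the shape of such a scheme, not as a statement of what Bill C-11 or any regulator requires.

```python
# Hypothetical "baseline plus tiers" configuration: protection never depends on the
# consumer taking action, because any missing or unknown choice falls back to the
# mandated baseline. Tier names and rules are illustrative assumptions only.

from dataclasses import dataclass

@dataclass(frozen=True)
class PrivacyTier:
    name: str
    allow_third_party_sharing: bool
    allow_interest_based_ads: bool
    retention_days: int

BASELINE = PrivacyTier("baseline", allow_third_party_sharing=False,
                       allow_interest_based_ads=False, retention_days=90)

TIERS = {
    "baseline": BASELINE,
    "personalized": PrivacyTier("personalized", False, True, 180),
    "data_for_services": PrivacyTier("data_for_services", True, True, 365),
}

def effective_policy(consumer_choice: str) -> PrivacyTier:
    """Resolve a consumer's stated choice, defaulting to the protective baseline."""
    return TIERS.get(consumer_choice, BASELINE)

print(effective_policy("data_for_services"))  # explicit opt-in to sell/exchange data
print(effective_policy(""))                   # no choice made -> baseline applies
```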

Conclusion

In the online era, internet users have transformed from being the consumers of the internet to being the product sold. Corporations collect and aggregate personal information from consumers, build profiles from that information, and then disseminate it to their third-party affiliates for revenue-generating purposes. While the control theory was useful in developing the meaningful consent principle embedded within the current legal framework of private sector privacy in Canada, it is time to think past this conception. Unfortunately, this model fails to capture the reality of the online landscape and has thus led to the formation of the privacy paradox phenomenon. The social theory of privacy is a good step towards correctly conceptualizing consumer privacy interests, but it ultimately falls short in capturing the nuances of online social relationships.

Because of the failures of the notice-consent model, I argue that a normative shift is required in how consumer privacy is operationalized in the online context. With all the technological advancements that have occurred since the enactment of PIPEDA, it may be time to rely on guidelines other than meaningful consent because, as the saying goes, “old keys will not open new doors.” I argue that the framework proposed by Bill C-11 takes two steps forward but one step backward when it comes to substantially protecting consumer privacy. While the requirements of transparency will provide more meaningful data management strategies for consumers, the lack of a data minimization strategy may only work to disillusion consumers further. As the use of technology becomes embedded in our daily lives, it could be useful to allow for a symbiotic relationship between big data, algorithms, and personalized privacy protection policies.


Caroline Ross is a third-year law student at Queen’s University Faculty of Law. She received an Honours Bachelor of Science from the University of Toronto while completing a double major in biology for health sciences and sociology. Since the beginning of her JD, she has been working with the Conflict Analytics Lab to build predictive settlement tools for law firms and self-represented litigants.

End Notes

1 Maor Weinberger, Dan Bouhnik & Maayan Zhitomirsky-Geffet, “Factors Affecting Students’ Privacy Paradox and Privacy Protection Behavior” (2017) 1:1 Open Information Science 3 at 5.

2 Ibid at 3.

3 Wei Zhou & Selwyn Piramuthu, “Information Relevance Model of Customized Privacy for IoT” (2015) 131:1 J Bus Ethics 19 at 19.

4 Spyros Kokolakis, “Privacy attitudes and privacy behaviour: A review of current research on the privacy paradox phenomenon” (2017) 64 Computers & Security 122 at 123.

5 Ruwan Bandara, Mario Fernando & Shahriar Akter, “Explicating the privacy paradox: A qualitative inquiry of online shopping consumers” (2020) 52 J Retailing & Consumer Services 1 at 2.

6 Marcel Becker, “Privacy in the digital age: comparing and contrasting individual versus social approaches towards privacy” (2019) 21 Ethics & Information Technology 307 at 307.

7 Herman T Tavani, “Philosophical theories of privacy: Implications for an adequate online privacy policy” (2007) 38:1 Metaphilosophy 5 at 5.

8 Samuel D Warren & Louis D Brandeis, “The Right to Privacy” (1890) 4:5 Harv L Rev 193 at 195.

9 Tavani, supra note 7 at 6.

10 Kirsten Martin, “Understanding Privacy Online: Development of a Social Contract Approach to Privacy” (2016) 137:3 J Bus Ethics 551 at 552 [Martin, “Social Contract Approach to Privacy”].

11 James Rachels, “Why Privacy Is Important” (1975) 4:4 Philosophy & Public Affairs 323 at 329.

12 Tavani, supra note 7 at 7.

13 Ibid.

14 Luiza Jarovsky, "Improving Consent in Information Privacy through Autonomy-Preserving Protective Measures (APPMs)" (2018) 4:4 European Data Protection L Rev 447 at 451.

15 Ibid at 452.

16 Commercial activities are defined as any “transaction, act or conduct or any regular course of conduct that is of a commercial character”. See Personal Information Protection and Electronic Documents Act, SC 2000, c 5, s 2(1) [PIPEDA].

17 Lisa M Austin, “Reviewing PIPEDA: Control, Privacy and the Limits of Fair Information Practices” (2006) 44 Can Bus L J 21 at 38 [Austin, “Reviewing PIPEDA”].

18 The 10 Fair Information Principles under PIPEDA are: (1) Accountability, (2) Identifying Purposes, (3) Consent, (4) Limiting Collection, (5) Limiting Use, Disclosure, and Retention, (6) Accuracy, (7) Safeguards, (8) Openness, (9) Individual Access, and (10) Challenging Compliance. See PIPEDA, supra note 16 at Schedule I (section 5).

19 Lisa M Austin, “Is Consent the Foundation of Fair Information Practices? Canada's Experience under Pipeda” (2006) 56:2 U Toronto L J 181 at 194 [Austin, “Is Consent the Foundation of Fair Information Practices?”].

20 Ibid at 196-97.

21 PIPEDA, supra note 16, Schedule I 4.3.

22 Office of the Privacy Commissioner of Canada, “Guidelines for obtaining meaningful consent” (24 May 2018), online.

23 PIPEDA, supra note 16, s 6.1.

24 R v Tessling, 2004 SCC 67 at paras 31-32 [Tessling].

25 Royal Bank of Canada v Trang, 2016 SCC 50 at para 44 [Trang].

26 PIPEDA, supra note 16, s 2(1).

27 Canadian Charter of Rights and Freedoms, ss 2(a), 7-8, Part I of the Constitution Act, 1982, being Schedule B to the Canada Act 1982 (UK), 1982, c 11.

28 Tessling, supra note 24 at para 25.

29 R v Plant, [1993] 3 SCR 281 at 293.

30 Valerie Steeves, “If the Supreme Court Were on Facebook: Evaluating the Reasonable Expectation of Privacy Test from a Social Perspective” (2008) 50:3 Can J Corr 331 at 335.

31 Trang, supra note 25 at para 49.

32 Daniel Susser, “Notice After Notice-and-Consent: Why Privacy Disclosures Are Valuable Even If Consent Frameworks Aren't” (2019) 9 J Information Policy 148 at 153.

33 Sophie C Boerman, Sanne Kruikemeier & Frederik J Zuiderveen Borgesius, “Exploring Motivations for Online Privacy Protection Behavior: Insights from Panel Data” (2018) Communication Research 1 at 2.

34 Martin, “Social Contract Approach to Privacy”, supra note 10 at 552.

35 Kirsten Martin, “Privacy Notices as Tabula Rasa: An Empirical Investigation into How Complying with a Privacy Notice Is Related to Meeting Privacy Expectations Online” (2015) 34:2 J Public Policy & Marketing 210 at 212 [Martin, “Privacy Notice Is Related to Meeting Privacy Expectations Online”].

36 Christoph Busch, “Implementing Personalized Law” (2019) 86:2 U Chicago L Rev 309 at 313.

37 Kokolakis, supra note 4 at 123-24.

38 Susser, supra note 32 at 157.

39 Martin, “Privacy Notice Is Related to Meeting Privacy Expectations Online”, supra note 35 at 214.

40 Bandara, Fernando & Akter, supra note 5 at 2.

41 Weinberger, Bouhnik & Zhitomirsky-Geffet, supra note 1 at 5.

42 Hadas Schwartz-Chassidim et al, “Selectivity in posting on social networks: the role of privacy concerns, social capital, and technical literacy” (2020) 6 Heliyon 1 at 4.

43 Ibid.

44 Weinberger, Bouhnik & Zhitomirsky-Geffet, supra note 1 at 6.

45 In 2020, it was reported that around 27 million Canadians were Facebook users. See NapoleonCat.

46 Kokolakis, supra note 4 at 131.

47 Douez v Facebook, 2017 SCC 33 at para 56 [Douez].

48 This article was written during the 2020 COVID-19 pandemic.

49 Weinberger, Bouhnik & Zhitomirsky-Geffet, supra note 1 at 4.

50 Daniel J Solove, “Introduction: Privacy Self-Management and The Consent Dilemma” (2013) 126:7 Harv L Rev 1880 at 1884.

51 Martin, “Privacy Notice Is Related to Meeting Privacy Expectations Online”, supra note 35 at 220.

52 Susser, supra note 32 at 165.

53 Martin, “Privacy Notice Is Related to Meeting Privacy Expectations Online”, supra note 35 at 219.

54 Boerman, Kruikemeier & Zuiderveen Borgesius, supra note 33 at 16.

55 Bandara, Fernando & Akter, supra note 5 at 3.

56 Jarovsky, supra note 14 at 450.

57 Samsung, “Samsung TV - Interest-Based Advertisements Service” (last modified 1 October 2020), online: Samsung.

58 David Dubrovsky, “Protecting Online Privacy in the Private Sector: Is there a ‘Better’ Model?” (2005) 18:2 RQDI 171 at 174.

59 Julien Cloarec, “The personalization–privacy paradox in the attention economy” (2020) 161 Technological Forecasting & Social Change 1 at 4.

60 Associated devices include websites, streaming video platforms, mobile apps, smartphones, tablets, and other advertising-capable devices. See Samsung, supra note 57.

61 Becker, supra note 6 at 310.

62 Avi Goldfarb & Catherine Tucker, “Online Display Advertising: Targeting and Obtrusiveness” (2011) 30:3 Marketing Science 389 at 400.

63 Bandara, Fernando & Akter, supra note 5 at 3.

64 Solove, supra note 50 at 1888.

65 Busch, supra note 36 at 322.

66 M Ryan Calo, “Against Notice Skepticism in Privacy (And Elsewhere)” (2012) 87:3 Notre Dame L Rev 1027 at 1054.

67 Cloarec, supra note 59 at 4.

68 Jarovsky, supra note 14 at 449.

69 Cookies are files that track user information when consumers visit websites. See Office of the Privacy Commissioner of Canada, “Web tracking with cookies” (6 May 2011), online.

70 Solove, supra note 50 at 1889-90.

71 Kokolakis, supra note 4 at 131.

72 Tavani, supra note 7 at 13.

73 Ron McLay, “Managing the rise of Artificial Intelligence” (2018) at 33, online (pdf): Human Rights & Technology.

74 Susser, supra note 32 at 157.

75 “Big data” is defined as “extremely large and complex data sets that are so voluminous that traditional data processing software cannot manage them”. See McLay, supra note 73 at 28.

76 Becker, supra note 6 at 309.

77 McLay, supra note 73 at 26.

78 Canada, Office of the Privacy Commissioner of Canada, Joint investigation of Facebook, Inc. by the Privacy Commissioner of Canada and the Information and Privacy Commissioner for British Columbia (OPC, 25 April 2019) at para 1 [PIPEDA #2019-002].

79 Ibid at para 85.

80 Nathalie Maréchal, Rebecca MacKinnon & Jessica Dheere, “Getting to the Source of Infodemics: It’s the Business Model” at 34, online (pdf): New America: Open Technology Institute.

81 Solove, supra note 50 at 1891-92.

82 PIPEDA #2019-002, supra note 78 at para 114.

83 Austin, “Is Consent the Foundation of Fair Information Practices?”, supra note 19 at 191.

84 “Psychological distance” is a main component of Construal Level Theory in social psychology. See Bandara, Fernando & Akter, supra note 5 at 5.

85 Dubrovsky, supra note 58 at 177.

86 Kokolakis, supra note 4 at 131.

87 Helen Nissenbaum, "Privacy as Contextual Integrity" (2004) 79:1 Wash L Rev 119.

88 Becker, supra note 6 at 311.

89 Martin, “Social Contract Approach to Privacy”, supra note 10 at 557.

90 Schwartz-Chassidim et al, supra note 42 at 2.

91 Ibid at 3.

92 General Data Protection Regulation, (EU) 2016/679.

93 Ibid at Article 25.

94 See European Union, “Data protection in the EU”, online: European Commission, for more information on the GDPR.

95 Bill C-11, An Act to enact the Consumer Privacy Protection Act and the Personal Information and Data Protection Tribunal Act and to make consequential and related amendments to other Acts, 2nd Sess, 43rd Parl, 2020, cl 69 (first reading, 17 November 2020).

96 Busch, supra note 36 at 330.

97 Solove, supra note 50 at 1894.

98 Zhou & Piramuthu, supra note 3 at 26.

99 Maréchal, MacKinnon & Dheere, supra note 80 at 31.

100 Susser, supra note 32 at 153.

101 Busch, supra note 36 at 326.

102 Information and Privacy Commissioner of Ontario, Privacy by Design: The 7 Foundational Principles, 2011, by Ann Cavoukian.