Children’s Privacy Code – Exploratory Consultation

Preface

The Canadian Bar Association is a national association representing 40,000 jurists, including lawyers, notaries, law teachers and students across Canada. The Association's primary objectives include improvement in the law and in the administration of justice.

This submission was prepared by the CBA Privacy and Child and Youth Sections, with assistance from the Advocacy Department at the CBA office. The submission has been reviewed by the Policy Committee and approved as a public statement of the CBA Privacy and Child and Youth Sections. 

I. Introduction

The CBA Sections commend the Office of the Privacy Commissioner of Canada (OPC) for its initiative in developing guidance for private-sector organizations under the Personal Information Protection and Electronic Documents Act (PIPEDA).1 Given the current legislative constraints within PIPEDA, such guidance represents an important starting point in supporting organizations’ compliance efforts and enhancing the protection of personal information.

II. Application of a children’s privacy code

A. Scope of Application

The CBA Sections submit that Canada requires a modern, enforceable legislative framework for children’s privacy that goes beyond PIPEDA guidance; aligns with international standards such as the United Nations Convention on the Rights of the Child (UNCRC),1 the UK Age Appropriate Design Code2 and European Union (EU) guidelines; applies consistently across all sectors and jurisdictions; and ensures coordinated implementation through cross‑government, industry, and human rights oversight mechanisms to fully realize children’s rights in the digital environment.

In developing its initial guidance, pending more formal legislative amendments empowering enforcement action, the OPC should consider the following details relating to the scope of application:

  1. Advocate for a legislative framework for the protection of children’s privacy in Canada. Bill C-27 (the Digital Charter Implementation Act) and Bill C-63 (the Online Harms Act) died on the Order Paper with the prorogation of Parliament in January 2025, and it appears that neither will be reintroduced. The legislative framework must go beyond addressing online harms to encompass age-appropriate design and intentionally designing services with children in mind. For example:
    1. Article 16 of the UNCRC, which sets out the privacy rights of children globally, must be interpreted and applied alongside Article 17. Article 17 sets out children’s information rights, and particularly their right to access information, “aimed at the promotion of [the child’s] social, spiritual and moral well-being and physical and mental health.” It further directs States Parties to the Convention to “encourage the development of appropriate guidelines for the protection of the child from information and material injurious to his or her well-being, bearing in mind the provisions of articles 13 and 18.”
    2. The UK’s Age-Appropriate Design Code, also known as the Children’s Code, is a statutory code of practice created under Section 123 of the Data Protection Act 2018,3 as amended to implement UK General Data Protection Regulation (GDPR)4 provisions for children’s data privacy. It is widely heralded as a global best practice. The protections for children’s privacy must meet or exceed those required by the UK standard and must be supported by legislation that allows for robust and effective enforcement of these laws.
    3. A further relevant benchmark is the European Commission’s recently published guidelines relating to the protection of minors under the Digital Services Act.5 In addition to benefiting children’s well-being, aligning Canadian industry standards for children’s privacy with Europe’s could encourage trade and export opportunities for Canadian developers and service providers.
  2. The measures proposed for age-appropriate design to protect and promote children’s privacy interests should apply in both the public and private sectors, and in all Canadian jurisdictions, unless a substantially similar code is in place in a province. In such cases, coordination with that province is required so that coverage is consistent across Canada.
  3. OPC guidance under PIPEDA cannot adequately achieve the goal of consistency across Canada or meet Canada’s enforcement obligations under the UNCRC. Specific legislative provisions within new or existing laws concerning children’s privacy are the appropriate vehicle to begin this reform. The OPC’s welcome efforts to provide guidance within the current PIPEDA framework do not supplant the need for more thorough, ongoing law reform.
  4. The CBA Sections submit that robust implementation of children’s privacy rights could provide a powerful lens to protect children’s privacy and, through that lens, their right to solitude, their right to a private and supportive family life, the protection of their reputation, and their control over the use of their image and likeness. The Article 8 jurisprudence of the European Court of Human Rights (ECtHR) is an example of such a lens in action. Canadian Charter jurisprudence on children’s rights remains underdeveloped. Children’s privacy rights must be interpreted and applied in keeping with the general principles of children’s rights and all their other rights guaranteed under the UNCRC. Meaningful implementation of children’s privacy rights in Canada requires not only dedicated efforts by industry and oversight bodies such as the OPC in relation to PIPEDA, but also a cross-sectoral approach and integrative collaboration with criminal justice systems, health systems, educational services and child protection services – to name a few – as well as general human rights oversight mechanisms, to respect the intersectional interdependence of children’s privacy interests with all of their other human rights. Structural change and formal mechanisms for joint implementation of children’s rights in relation to the digital environment are required for full implementation of the rights of the child as set out in General Comment 25.6 The OPC’s leadership in providing PIPEDA guidance is important and welcome. However, to effectively champion the interests of Canadian children, the OPC must also collaborate across government departments, with industry, and among different levels of government.
  5. The OPC’s guidance within the current PIPEDA framework should also be developed in conjunction with guidance on public sector service offerings impacting children under the Privacy Act.7 Moreover, detailed guidance should be available on how health technologies and information management practices can be developed to protect children’s privacy interests in the health sector, in both its private and public dimensions.

Should a children’s privacy code apply differently to sites exclusively directed at children and those directed at a broad audience that includes children?

  1. Age-appropriate design guidance must apply to services and products designed specifically for children, as well as to services that children may reasonably be foreseen to access or in which they are the subject of information being disseminated. The question should not be whether the intended audience contains a “significant number of children” but whether children’s use of and access to the services is reasonably foreseeable. Age verification and age assurance standards in the UK and in Europe are moving towards regulation based on whether children are likely to access the content, which addresses situations where children are users. There is also a need, however, to address situations where children are not the users but are the subject of the information being disseminated, as that also impacts their right to privacy. For example, an app used by a daycare provider to communicate with parents of young children, while not accessed by the children themselves, may contain and disseminate significant amounts of information about them. For adult-only services, the onus must be clearly placed on the service provider to monitor age-based access and to ensure both that the services are not available to children and that the content being made available does not violate the rights of children (for example, child sexual abuse material). The UK’s recent guidance to industry on highly effective age verification practices for adult sites is a helpful reference for Canadian policy and guidance around children’s access to online services.
  2. Drawing from Dr. Ignacio Cofone’s work in The Privacy Fallacy, OPC guidance should anticipate that the regulation of civil liability for harms to children from online play, work, learning and social environments will need to be approached on both tort and contract law bases. Moreover, issues such as the intellectual property rights of children (to their image and likeness, and to the content they create) must also be addressed in a way that allows them to realize their creative goals and enforce their rights. While some of this may be accomplished via privacy-related guidance and related legislative amendments, other areas of law will also need to be engaged.
  3. A children’s privacy code should not apply differently to sites exclusively directed at children and those directed at a broad audience that includes children. First, websites are rarely exclusively directed at children, so the distinction would find little application. For example, while Kids Help Phone provides national resources for youth, it also offers content to support parents; though directed at children, it is not exclusive to them. Applying a different baseline to such sites would introduce unnecessary complexity.
  4. Second, much depends on the definition of “exclusive” and the additional obligations that attach to it. If the definition is too strict, sites may introduce content targeted at adults to exempt themselves from the additional obligations; instead of encouraging adherence to norms, this may motivate companies to escape them. The alternative – an expansive definition of “exclusive” – becomes functionally equivalent to a likelihood-of-being-accessed-by-children test.
  5. International trends are moving away from a two-tiered approach. In the United States, the Children’s Online Privacy Protection Act (COPPA)8 historically applied to websites directed at children under 13 or to services with actual knowledge that a user is under 13. This meant many general-audience sites avoided COPPA unless they knowingly collected data from young children. However, new U.S. proposals and state laws are broadening the scope. For example, California’s Age-Appropriate Design Code (CAADC)9 (effective 2024) covers any online service “likely to be accessed” by minors (under 18) – including general-audience sites – not just services overtly aimed at children. In Europe, the approach is similarly broad: the UK’s Age-Appropriate Design Code applies to all digital services likely to be accessed by users under 18, even if the primary audience is adults.

Which factors should be considered when determining the likelihood of children accessing a service?

  1. The UK Information Commissioner’s Office (ICO) has enumerated, in its FAQ, several factors that are instructive. These include, among others: the nature and content of the service and whether it has particular appeal for children; the number of child users of the service; whether advertisements on the service, including third-party advertisements, are directed at or likely to appeal to children; complaints received about children accessing the service; and, conversely, any measures put in place to prevent children from gaining access to content.
  2. As technology and business practices continue to evolve, so will the quantity and quality of available evidence. Any evaluation of children’s likelihood of accessing a service should follow a common-sense approach.

How can this assessment be done in a privacy-protective manner?

  1. In its October 2024 submission on Privacy and Age Assurance – Exploratory Consultation,10 the CBA Privacy and Access Law Section recommended a risk-based approach, under which any privacy requirements should be proportional to the level of risk posed to children based on the kind and quantity of information being collected about them. A similar rationale can apply to the requirements imposed on website developers: more intrusive age verification may be required only where a website’s content poses a substantial risk to children who access it.

Should a children’s privacy code only apply when certain risks or harms are possible due to access to or use of the site – and if so, which ones?

  1. A children’s privacy code should always apply, but on a proportional, risk-based basis. This would align with international trends, Canada’s historical approach, and the complexity of determining how often children access certain websites and how that access impacts them.

When considering risks and harms, the UK’s recent guidance to industry on highly effective age verification practices for adult sites is again a helpful reference for Canadian policy and guidance on children’s access to online services (Quick guide to Protection of Children Codes).11 In addition to the harms identified there, we recommend the OPC consider the risk of discrimination against children at certain vulnerable intersections, including children who are disabled, Indigenous, or 2SLGBTQ+. To fully understand these harms, intentional efforts should be made to seek comments from these communities.

III. Enabling the exercise of children’s privacy rights


IV. Designing to address privacy impacts and the best interests of the child

The CBA Sections agree with the need to include risk assessment in privacy impact assessments (PIAs) involving children, and that Livingstone’s 4Cs of online harm to children – content risks, conduct risks, contact risks and contract risks – offer a helpful categorization of the harms to be considered.20 The United Kingdom’s Information Commissioner’s Office has also provided a useful list of harms in its age-appropriate design code at section 2 (Data protection impact assessments).21 It explains that harm can be physical, emotional, developmental or material, and provides some examples:

  1. physical harm;
  2. online grooming or other sexual exploitation;
  3. social anxiety, self-esteem issues, bullying or peer pressure;
  4. access to harmful or inappropriate content;
  5. misinformation or undue restriction on information;
  6. encouraging excessive risk-taking or unhealthy behaviour;
  7. undermining parental authority or responsibility;
  8. loss of autonomy or rights (including control over data);
  9. compulsive use or attention deficit disorders;
  10. excessive screen time;
  11. interrupted or inadequate sleep patterns;
  12. economic exploitation or unfair commercial pressure; or
  13. any other significant economic, social or developmental disadvantage.

However, PIAs alone are not sufficient to protect children’s right to privacy. A standard that seeks to intentionally protect and promote children’s privacy by design should include a full Child Rights Impact Assessment (CRIA), recognizing the interdependence among all children’s rights and, in particular, the strong nexus between a child’s privacy, the general principles of children’s rights, and their enjoyment of family life, their freedoms of conscience, expression and belief, their right to play, their minority linguistic, cultural and religious rights, and their right to be protected from all forms of violence. The best interests of the child demand that service providers and content developers whose content or services are likely to be accessed by children turn their minds to the development of the whole child and how their offerings might impact that end user. Special regard is also needed for vulnerable populations of children, including children in poverty, children with disabilities, 2SLGBTQ+ youth, street children, unaccompanied migrants, and children from minority or Indigenous communities. The question is not how PIAs can be augmented by a best-interests-of-the-child lens; it is how CRIAs can accompany and augment PIAs when designing products or services that children may access.

When considered in this way, it becomes clear that child participation is essential in the impact assessment process. Scotland and Wales have developed robust practices to support child participation in CRIA processes, and the OPC should look to those practices for guidance on developing similar practices in Canada.22 Using a child rights lens, the forms of harm from the digital environment also take on a broader scope, encompassing not only physical but mental, emotional and psychological harms to children, including the impacts of increased sedentariness and of isolation from non-virtual, in-person contact with their friends, peers and family members. The specific privacy rights of children in detention, in formal systems of care, in hospital settings, and in educational settings are also made more explicit through CRIAs. The federal Department of Justice has recently developed a helpful CRIA tool, and the OPC should adopt it in its own processes.23 The OPC should also recommend its use and adoption by industry, together with the PIA process, when designing with children in mind.

V. Ensuring child-appropriate transparency practices

In A.B. v. Bragg Communications Inc.,24 Abella J., speaking for the Court, acknowledged the deep roots in Canadian law of respect for the inherent vulnerability of children, recognizing the need for differentiated approaches as a general standard, not an individuated one, since “the law attributes the heightened vulnerability based on chronology, not temperament.” As the OPC points out, privacy regulators elsewhere have taken this a step further when it comes to designing digital environments that consider children’s evolving capacities, as recognized in Article 5 of the UNCRC. The CBA Sections recommend the adoption in Canada of guidance like the UK Commissioner’s guidance on transparency under standard 4 of the UK Children’s Code25 and as further particularized in Annex B of that Code on children’s age and developmental stages.26

Guidance on transparency alone, however, will not provide a sufficient, timely or effective tool to correct course in Canada, since the OPC’s own research shows that over 96% of websites and apps use privacy policies that are overly lengthy, technical and confusing for adults and children alike. Legislative measures with enforceable sanctions and penalties are required to ensure that industry follows the required guidance on practising child privacy by design and designing intentionally with children in mind as potential end users. When school boards initiate multi-billion-dollar civil suits to recover the system-wide mental health costs and educational losses suffered by Ontario children because of the design choices of social media giants,27 and when Australian legislators act to ban social media use by children under sixteen,28 the chasm between youth and parental privacy expectations and industry and regulatory efforts to meet them looms large. Public guidance from regulators regarding these transparency practices needs to be accompanied by new legislated standards with robust enforcement mechanisms, particularly to sanction industry players who engage in deceptive practices, inappropriate targeted behavioural marketing, or the collection or use of data about children’s online activities. Beyond legislation and enforcement measures, significantly more funding is required for public education to raise awareness among parents, children and youth of the online risks children may face and to provide meaningful guidance for navigating the ever-increasing complexity of online platforms and services. Better outreach and funding for education, awareness and research activities by both government and non-profit organizations should be part of the public sector’s response.

When it comes to transparency about practices, no information should be presented strictly to parents/guardians (or trusted adults). Instead, the OPC should adopt an approach that follows the spirit of UNCRC article 5,29 which requires that parents assist children in understanding and exercising their rights. To this end, all information should be made understandable to children as much as possible. Where information is too complex to be reasonably appreciated by the targeted age group, children should be directed to have a parent or guardian read it and guide them in using the product.

VI. Being privacy protective by default

The CBA Sections agree that organizations should, by default, turn off tracking of children, including location tracking. We question, however, whether the proposed exceptions – necessity for service functionality while the application or service is in use, and circumstances where the child’s best interests support such tracking – are sufficient or necessary. The UK Children’s Code guidance on default settings notes that the ICO “will look carefully at any claims that a privacy setting cannot be provided because the personal data is needed to provide the core service”, and its Annex C provides guidance on what constitutes a “lawful basis for processing”.30 Globally, regulators are calling into question the business model of many online service providers and whether that model is appropriate or exploitative as it relates to children. Even very high default privacy settings may not be sufficient or appropriate if service providers can incentivize children to turn them off or modify them to less privacy-protective settings. The OPC must set clear limits through child-specific guidance on inappropriate data practices under subsection 5(3) of PIPEDA. Targeted content delivery and marketing, live-streaming and image-sharing with unknown individuals, and the use of precise geolocation and biometrics should all be assessed under that lens.

The CBA Sections urge the OPC to adopt guidance consistent with the advice of the Committee on the Rights of the Child, in particular its General Comment 25,31 which provides in part that:

41.    States parties should make the best interests of the child a primary consideration when regulating advertising and marketing addressed to and accessible to children. Sponsorship, product placement and all other forms of commercially driven content should be clearly distinguished from all other content and should not perpetuate gender or racial stereotypes.

42.    States parties should prohibit by law the profiling or targeting of children of any age for commercial purposes on the basis of a digital record of their actual or inferred characteristics, including group or collective data, targeting by association or affinity profiling. Practices that rely on neuromarketing, emotional analytics, immersive advertising and advertising in virtual and augmented reality environments to promote products, applications and services should also be prohibited from engagement directly or indirectly with children.

And also:

68.    Data may include information about, inter alia, children’s identities, activities, location, communication, emotions, health and relationships. Certain combinations of personal data, including biometric data, can uniquely identify a child. Digital practices, such as automated data processing, profiling, behavioural targeting, mandatory identity verification, information filtering and mass surveillance are becoming routine. Such practices may lead to arbitrary or unlawful interference with children’s right to privacy; they may have adverse consequences on children, which can continue to affect them at later stages of their lives.

69.    Interference with a child’s privacy is only permissible if it is neither arbitrary nor unlawful. Any such interference should therefore be provided for by law, intended to serve a legitimate purpose, uphold the principle of data minimization, be proportionate and designed to observe the best interests of the child and must not conflict with the provisions, aims or objectives of the Convention.

Retention of a child’s personal information should be informed by the needs, benefits and risks to the child stemming from such retention. Children must also have an easy way to seek the correction or erasure of any retained information. We view such measures as more important, and better suited to the best interests of children, than default retention settings. Where appropriate and safe, tailored practices that give children increased agency over the use of their data should be favoured over general measures that give them less agency.

VII. Avoiding deceptive practices

Reports that children are more at risk from deceptive practices than adults are especially troubling. The OPC Code guidance should be clear and unambiguous that deceptive practices are unethical and contrary to children’s best interests. Law reform is needed to ensure that such industry practices are punishable and appropriately sanctioned.

However, nudging practices that encourage children to adopt privacy-protective practices online are appropriate when used in a clear and transparent fashion. Here again, educational interventions need state and industry support to make children and parents aware of deceptive practices and how to guard against them.

VIII. Limiting disclosure of children's information

The CBA Sections agree that the principle of limiting disclosure of children’s personal information takes on heightened importance given children’s vulnerability. However, children’s best interests often demand transparency and the sharing of personal information to advance their rights to health, life, survival and development. This is an area where discretion needs to be exercised at an individual level across many systems. Children’s best interests are unlikely to be advanced by rules or guidance suggesting that some children’s information, or some categories of personal information, should never be disclosed under any circumstance. General criteria of necessity, reasonableness, risk of harm and best interests should always prevail.

The OPC Code guidance should also emphasize the principle of limiting the collection of children’s information. In many cases, harmful use and disclosure of children’s personal information can be mitigated at the source by circumscribing the initial collection of that information. Strong data minimization principles in relation to children’s data make good sense, but they must be implemented in specialized and differentiated ways so that truly harmful practices are addressed without crippling beneficial ones.

IX. Enforcement

Considering that the protection of children’s privacy and personal information is the cornerstone of the OPC’s Code, it is imperative that the Code implement mechanisms for enforcement and consequences for violations, including fines and penalties.32

The UK ICO has various powers to act on a breach of the UK’s Age-Appropriate Design Code, including the power to issue warnings, reprimands, stop-now orders and administrative fines. For serious breaches of the data protection principles, the UK ICO can issue fines of up to £17.5 million or 4% of the annual worldwide turnover of the non-compliant organization, whichever is higher.

The California Age-Appropriate Design Code Act (CAADCA)33 was slated to take effect on July 1, 2024; however, in NetChoice, LLC v. Bonta, the U.S. Court of Appeals for the Ninth Circuit upheld part of a preliminary injunction granted by the U.S. District Court on First Amendment grounds.34 The CAADCA is enforceable by the California Attorney General. Before initiating any action, the Attorney General must give a business in “substantial compliance” with the Act a 90-day period to cure any alleged violation. The Act does not include a private right of action. Remedies for violations include injunctive relief and civil penalties ($2,500 per affected child for each negligent violation and $7,500 per affected child for each intentional violation).

COPPA imposes certain requirements on operators of websites or online services directed to children under 13 years of age, and on operators of other websites or online services that have actual knowledge that they are collecting personal information online from a child under 13. The Federal Trade Commission (FTC) is responsible for investigating and bringing legal action against companies that violate COPPA, and may require companies to implement new privacy policies and procedures to ensure compliance. COPPA violations can also be reported to state attorneys general or consumer protection agencies. Noncompliance can result in civil penalties.

Should the OPC adopt the Children’s Privacy Code, the CBA Sections recommend that consideration be given to future legislative amendments providing the OPC with enforcement powers, including the power to impose administrative fines.

X. Conclusion

Pending broader legislative reform, the establishment of a Children’s Privacy Code constitutes a significant advancement. The CBA Sections support finalizing this Code in advance of, or concurrently with, any new statutory measures governing the collection, use, and disclosure of children’s personal information in the course of commercial activity in Canada.


We remain available to discuss any aspects of the foregoing with the OPC.

Endnotes

1 Personal Information Protection and Electronic Documents Act, S.C. 2000, c. 5.

2 United Nations, Convention on the Rights of the Child, 20 November 1989, 1577 UNTS 3, online.

3 Information Commissioner’s Office, Age appropriate design: a code of practice for online services (2020).

4 Data Protection Act 2018 (UK), c. 12.

5 GDPR (Regulation (EU) 2016/679).

6 The EU Digital Services Act (2024; online edn, Oxford Law Pro), online. See guidelines for the protection of minors.

7 European Commission, “Commission publishes guidelines on the protection of minors”, Shaping Europe’s digital future, online.

8 Privacy Act, R.S.C. 1985, c. P-21.

9 Children’s Online Privacy Protection Act of 1998, 15 U.S.C. §§ 6501–6506 (Pub. L. 105–277), enacted October 21, 1998

10 CA Civ Code § 1798.99.28 (2024)

11 Submission on Age Assurance, online (CBA: Ottawa, 2024).

12 Quick guide to Protection of Children Codes, online.

13 Supra, note 1, s. 6.1.

14 Consumer Protection Act, CQLR c. P-40.1, s. 248.

15 All in the Data Family: Children’s Privacy Online, online.

16 Google LLC v. Canada (Privacy Commissioner), 2023 FCA 200.

17 Supra, note 1, Schedule 1, s. 4.4.1.

18 European Commission, “Guidelines on the Protection of Minors under the Digital Services Act (DSA),” published 14 July 2025, online.

19 Optional Protocol to the Convention on the Rights of the Child on the Sale of Children, Child Prostitution and Child Pornography, online.

20 Livingstone, S., & Stoilova, M. (2021). The 4Cs: Classifying Online Risk to Children. CO:RE Short Report Series on Key Topics, online.

21 United Kingdom’s Information Commissioner’s Office, “Age appropriate design: a code of practice for online services,” online.

22 Scotland CRIA, online, and Wales CRIA, online.

23 Department of Justice CRIA, online.

24 A.B. v. Bragg Communications Inc., 2012 SCC 46.

25 UK Information Commissioner's Office. UK Children’s Code Transparency, online.

26 Supra, note 25, Annex B: Age and developmental stages, online.

27 Toronto District School Board v. Meta Platforms Inc., 2025 ONSC 1499 (CanLII), online.

28 Australia, Department of Infrastructure, Transport, Regional Development, Communications, Sport and the Arts, “Social media minimum age”, online.

29 Supra, note 2.

30 Information Commissioner’s Office (2020), Age Appropriate Design Code, online.

31 UN Committee on the Rights of the Child, General Comment No. 25 (2021) on children’s rights in relation to the digital environment, CRC/C/GC/25, online.

32 In the absence of legislative reform empowering the OPC to impose fines and penalties, consideration should be given to the circumstances under which problematic practices may be referred to other agencies, such as the Competition Bureau, for enforcement, or where the OPC would pursue remedies in Federal Court following an OPC investigation and report.

33 CA Civ Code § 1798.99.28 (2024)

34 NetChoice, LLC v. Bonta, No. 5:22-cv-08861-BLF, United States District Court for the Northern District of California (2025).