Claudiu’s comments on the privacy protection promises aligned with the introduction of Bill 188 of 2024

Protecting Children’s Data: What Will It Take?

Bad Privacy Blog by Claudiu Popa
May 17, 2024


Proposed legislation takes a multifaceted approach to student data privacy, but do these ideas make sense — and will they stick?

Last week, as part of its initiative to “transform child and family services to improve children and youth well-being”, Ontario’s provincial government introduced the Supporting Children’s Futures Act, 2024, with a broad mandate spanning oversight of foster care and group homes, stricter regulation of children’s aid societies and expanded background checks for individuals working with vulnerable youth.

The Bill, introduced by the Minister of Children, Community and Social Services, proposes enhancements to modernize the legislative and regulatory framework of the Child, Youth and Family Services Act, 2017, and various other Acts. Its introduction was echoed by the Minister of Public and Business Service Delivery and the Minister of Education, each adding positive and encouraging commentary on the need to protect children’s personal information; something that is beyond dispute.

Sounds good. What about privacy?

Included in this new bill are recommendations for “Enhancing privacy protections of those who are currently or were formerly in the child welfare system by further restricting access to personal childhood histories and protection records.”

This sounds great, as it indicates that legislators and their advisors have recognized the need for clear protections for data that is stored over long periods of time, as part of longitudinal studies, for example. It goes on to say:

The government is increasing the protection of personal and private information of individuals who have previously been in care or have had interactions with the child protection system as children or youth. Proposed regulations will be developed to further protect the private details of a person’s childhood experience and reinforce their control over their information, while also allowing individuals to share their childhood histories on their own terms.

This is where things get interesting, particularly because of what is not said. As everyone knows, the best way to protect privacy is to not store data in the first place. This summary implies that records are kept for very long periods, with no provision for data to be deprecated and deleted once obsolete. It allows individuals to ‘share their childhood histories on their own terms’, but what if those terms include the right to have those histories forgotten?

The devil is in the details

It is worth pausing on weighty phrases such as “protect the private details of a person’s childhood experience”, which prompt many questions: who decides which details to keep, and why is no mention made of the critical need for a mechanism to avoid collecting, or to delete, details that are not necessary?

As just about anyone can imagine, overcollection of data significantly increases the risk of breaches because it creates more opportunities for sensitive information to be exposed. This is borne out by the rising number of data breaches involving large amounts of personal data. According to Apple, some 80% of breaches involve personal information stored in “the cloud”.

The inclusion of the term “modernize” in the Bill can safely be taken as a hint that cloud storage of children’s personal data is part of the plan for the proposed safeguards and selective sharing of this sensitive data.

(See xkcd: “The Cloud”.)

As an aside, I’m old enough to remember one of the most sinister uses of the term ‘modernize’: the York Region District School Board used it to bypass parental consent and make a far-fetched case for the need to ‘modernize’ the simple classroom attendance process by moving it to the cloud and into the hands of a local edtech company. The move eventually resulted in the large-scale ‘modernization’ of numerous other internal processes, which has made the Board irreversibly dependent on ‘the cloud’ and vulnerable to the many embarrassing privacy-related snafus that have since befallen the institution.

[First thing that comes up on Google when combining the terms ‘modernize’ with ‘YRDSB’]

To conclude our discussion of details: Bill 188 (the Supporting Children’s Futures Act, 2024) has, in fact, very little to say on the topic of privacy, but in addition to the aforementioned paragraph on “reinforcing control” is this further comment:

the government is proposing changes that would enable the Ontario College of Social Workers and Social Service Workers to share information with other professional governing bodies, such as a college governing social workers in another province. In certain circumstances, information could also be shared with other entities, including children’s aid societies, to confirm when a social worker or social service worker is under investigation or to prevent or reduce a serious risk of harm.

To reduce the risk of harm, controls must be in place to prevent data misuse and abuse. At a minimum, this will require:

  • conducting due diligence on every entity with access to personal information or even metadata (information about the information collected from children, regardless of consent)
  • ensuring that security controls are in place for confidentiality protection and secure data disposal
  • validating that interprovincial privacy standards are harmonized

Most importantly, a mechanism needs to exist whereby the Ontario College of Social Workers and Social Service Workers can revoke access and enforce, in a timely fashion, the verifiable deletion of previously shared information in the custody of other entities.

Interconnected Bills

On the heels of Bill 188 comes the additional proposed legislation announced as Safeguarding the Data of Children and Youth, which specifies privacy safeguards that the Minister of Public and Business Service Delivery introduces as follows: “Our government wants our children to have a healthy, safe and age-appropriate digital experience when engaging with public sector organizations like schools, which is why we are safeguarding their best interests by putting guardrails in place to better protect them.”

The press release goes on to say that the Ontario government will collaborate with school boards, parents and groups overseeing children in provincial settings to ensure the right protections are introduced without affecting the quality of education or interfering with schools’ ability to choose the right tools for the classroom.

Ironically, as an example of children’s data used inappropriately, the press release mentions “the recent ransomware occurrence at a large Ontario District School Board”, referring to the still-ongoing investigation into the latest data breach at the York Region District School Board.

Overall, the proposed legislation would strive to “strengthen data and privacy protections, cyber security, responsible use of AI and customer service delivery”: all good things that will require significant availability of unbiased expertise and deep experience in the field. Since these assets have historically been scarce in public education, the Ontario government will hopefully lean on its own resources for qualified sources and certified professionals.

On that note, Stephen Lecce, Minister of Education said: “After removing social media from school networks and devices, restricting cell phones in class and banning vaping, our government is taking additional action to further safeguard our children while they are online. We will be bringing in social media and tech industry experts to discuss how they can further curtail risks, specifically focused on cyberbullying, age-appropriate access to content, and cracking down on risks to kids while online. This is about protecting the physical and mental health of children.”

The announcement also acknowledged that in 2023, the Law Commission of Ontario issued a recommendations paper citing the lack of protections for youth, elderly and other vulnerable communities against risks in the digital marketplace. I tracked down that document here.

While the announcement is otherwise very vague on the security protections being put forth, it implicitly acknowledges that past procurement mistakes and mishandling of sensitive information by school boards and other institutions have been at the core of privacy breaches, and that future safeguards will seek to prevent their recurrence.

The regulatory changes would enable the creation of protections to better safeguard children’s information from being stolen or used inappropriately in cyber incidents. Future regulations could include age-appropriate standards for software programs on devices, like laptops, used by students at school, and strengthened standards for software procurement by schools to prevent the use or sale of student data for predatory marketing by third parties.

All of this is absolutely necessary and well overdue. It is clear that although the public hears very little about the privacy violations and security breaches that involve children’s personal information, such incidents directly inform the need to articulate proposed legislation in this particular manner.

The influence of underreported breaches

Take for instance the statements “the Ontario government will collaborate with school boards, parents and groups overseeing children” and “the majority of programs for very young children collect their digital identifiers and share them with third party marketing companies”. Does that sound like most of the breaches and violations that Canadian families have been subject to for the past decade? Absolutely.

Can we go back and put those organizations on the spot? Not really, as it might hurt feelings and expose the inadequacy of investigations and the enforcement failures of regulators. But can we agree that a forward-looking plan, if properly designed and implemented, stands to unify efforts and introduce accountability? Yup!

The… protecting people online thing

Enter the “Ontario Strengthening Cyber Security and Protecting People Online” announcement for The Strengthening Cyber Security and Building Trust in the Public Sector Act, 2024. That’s a mouthful, but no one said that these things needed to have a great title, only a descriptive one.

This sweeping piece of legislation promises to also strengthen safeguards for children’s personal information and include a foundation for the ethical use of A.I. in the public sector. A tall order, but what is meant by “also”?

  • Strengthening cyber security in the public sector. This includes critical sectors such as hospitals, schools and children’s aid societies. The legislation will help these organizations prevent and rapidly respond to cyber threats and attacks and minimize service interruptions, ensuring these organizations can continue to operate even when breaches occur.
  • Safeguarding the data of children and youth from being stolen or used inappropriately with stronger privacy protections when they are in settings like schools. Future regulations could prevent the misuse or sale of student data for predatory marketing by third parties, ensuring children are not unduly targeted or exploited by technology providers.
  • Modernizing privacy protections. Increasing the authority of the Information and Privacy Commissioner of Ontario (IPC) to investigate and respond to privacy breaches and inappropriate use of personal data, and mandating organizations to complete privacy impact assessments.
  • Building a strong foundation in artificial intelligence (AI) governance to solidify Ontario’s leadership in the responsible adoption of AI and emerging technologies. AI has the potential to transform vital programs and enhance services for the people of Ontario and we are ensuring it is used in a transparent, accountable, and ethical way.
  • Improving online customer service delivery. With the proposed changes, Ontarians who choose to opt-in can enjoy a more efficient experience with government services. The introduction of “tell us once” features means users will not have to repeatedly enter the same information during their interactions. This not only speeds up processes but also reduces the potential for errors, making government services more user-friendly and effective.

This is followed by a surprising statistic: with more than 400 artificial intelligence firms and institutions, our province is at the centre of an AI-enabled future. The chance that a significant percentage of such startups are taking ‘ethics’ and ‘privacy’ into consideration at this stage is slim to none, underscoring the need for safe and responsible AI application design and development.

The regulation and validation of ethical AI is critical to the protection of children’s data from this point onwards. As stated in the bill, “future regulations could prevent the misuse or sale of student data for predatory marketing by third parties, ensuring children are not unduly targeted or exploited by technology providers.”

This clearly indicates awareness on the part of the proponents of these bills that such protections are not consistently in place, particularly in situations where school boards select vendors without adequately validating their security, monitoring service levels, enforcing privacy, investigating their ownership structure, reviewing terms of service, scrutinizing policies and conducting due diligence into their supply chains.


For my part, this acknowledgement is a ray of sunshine. It indicates that the interest and the will now exist where they previously did not. I would optimistically hope that provincial organizations will no longer downplay privacy violations and sweep data breaches under the rug, but even if some such incidents persist, a baseline of risk maturity is poised to rise over time.

While proposed legislation may initially amount to superficial checklists, I would hope that such initiatives will increase accountability and rest on a solid foundation of industry standards for data protection.

For such legislation to be effective, however, broad controls over the enforcement of informed privacy consent must be respected. Additionally, key elements must be in place to eliminate the problems that have been plaguing school board administrators and victimizing families:

  • Year-End Data Deletion: annually purging all data collected and shared by educational technology vendors, cloud providers and the entire supply chain is the fundamental requirement for any legislation to have a hope of protecting children’s data. Pioneered in the US, this measure works to minimize available data while protecting students from future security and privacy breaches.
  • Provincial controls for service level monitoring: ensuring that school boards, children’s aid organizations and other provincial entities benefit from adequate degrees of awareness, accountability and oversight is equally critical in the prevention of violations and detection of data breaches in time to properly respond to them.
  • Third Party Risk Management: with the arrival of the global COVID-19 pandemic, an onslaught of data brokers masquerading as edtech companies entered the marketplace, effectively forcing public bodies to take their underpriced offerings at face value. Their third and fourth parties have been feasting on student data for the better part of the past decade, and thanks to the ‘pandemic bump’ and the advent of AI, this widespread data sprawl threatens breaches of incalculable proportions. A TPRM program needs to be in place at the provincial level to cap future losses, while urgently eliminating vendors that have repeatedly failed to demonstrate a commitment to accountability and the ethical use of data.
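
Of these three elements, year-end data deletion is the most mechanical, and a minimal Python sketch shows how simple the core rule is. The June 30 cutoff and the record shape here are my own assumptions for illustration, not anything prescribed by the proposed legislation: every record is tied to the school year in which it was collected, and once that year ends the record is purged.

```python
from datetime import date

# Assumed school-year boundary for this sketch (not specified in any bill).
SCHOOL_YEAR_END_MONTH, SCHOOL_YEAR_END_DAY = 6, 30

def school_year_end(collected: date) -> date:
    """End of the school year in which a record was collected.
    Records from September onward belong to the year ending next June."""
    year = collected.year if collected.month <= SCHOOL_YEAR_END_MONTH else collected.year + 1
    return date(year, SCHOOL_YEAR_END_MONTH, SCHOOL_YEAR_END_DAY)

def purge_expired(records: list[dict], today: date) -> list[dict]:
    """Keep only records whose school year has not yet ended;
    everything else is deleted at year end."""
    return [r for r in records if school_year_end(r["collected"]) >= today]
```

The design point is that retention is bounded by policy rather than left open-ended: the question is never “should we delete this?” but only “has its school year ended?”.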

Many atomic requirements, such as incident management, privacy impact assessments, information inventories and risk registers are absolutely needed to change the status quo, but this list of three key elements can be readily used by parents and experts alike as a simple checklist for the adequacy and potential effectiveness of proposed legislation.

By ensuring that the attack surface is reduced, that organizations and administrators are held accountable for their own actions and that their supply chains are responsible for the use of information, provincial bodies have a real chance to protect children’s data over the coming decades and set an example for national and international jurisdictions to follow.




Fīat jūstitia, ruat cælum. Personal musings on data protection fails, snafus & oddities, written & edited by Claudiu Popa; author, educator, booknerd.