In the author’s humble opinion, Zoom’s overcollection and storage of children’s personal information represent a privacy nightmare for parents and teachers.
Zoom teleconferencing has taken the Internet by storm, thanks in part to a pandemic that caught the world unprepared. But it has also touched the world's data in a very real way. The app has been a great way for teams to maintain some semblance of productivity during a long period of separation from the workplace, and for many people working from home it has been an essential enabler of connectivity and collaboration. Yet while responsible adults and businesses are theoretically able to decide which tools to use, or at least to skim privacy policies and data transparency reports, children have no such freedom.
The company has since clarified that it doesn't so much sell the data as share it with its partners as needed. But even if children's information is not blatantly commercialized, there are numerous ways it can be abused: identity theft, AI-powered data mining, targeted advertising, surveillance, stalking, privacy invasions and breached data sold to the highest bidder. This is not an exercise in FUD. It is simply a pre-emptive answer to the snarky contrarian takes of those likely to play advocatus diaboli, because smart alecks masquerading as sophisticated iconoclasts are never far away, even in matters involving human rights and the safety of children's data.
Parents and teachers are typically required by school districts to use technology such as Zoom because it is broadly accessible, consistently compatible and, as the company never tires of repeating, it just works. Unfortunately, no authoritative body has yet verified its claims of compliance with Europe's GDPR or Canada's PIPEDA privacy laws, although New York's Attorney General has recently opened an investigation into the company's privacy practices. The FBI, as discussed below, has concerns of its own.
But the real reason so many have unquestioningly adopted this potentially intimate technology is simply its popularity. The recursive trap of the argumentum ad populum demonstrates the human brain's vulnerability to cognitive bias, but it also hints at just how much resistance there is to reading privacy policies, terms of service and other legalese.
The consent implied by the use of a tool serves as a 'we told you so' that may be acceptable in sectors where sensitive personal information does not constitute the bulk of the communication. In communications involving minors, however, all data should be considered sensitive, placing a significant due-diligence burden on the shoulders of public education authorities. That due diligence should also be shared among educational institutions, both to verify that it is properly conducted and to help organizations with limited means piggyback on the good work of qualified professionals. The current lack of such sharing presents a significant risk to public education as a growing number of edtech tools floods classrooms under the guise of modernization.
Zoom is technically a Chinese company (or three)
According to the University of Toronto's Citizen Lab, Zoom is made in China in a very real way. Despite being incorporated in the US, Zoom is the product of three Chinese companies and employs at least 700 developers located in China. Citizen Lab also found that a number of third-party companies have been established to sell the Zoom app within China.
Whether Zoom is an intelligence threat or poses risks to national security (à la Huawei) is outside the scope of this article. But its use of proprietary encryption and past claims of end-to-end protection are a grave concern to authorities such as the New York City Department of Education (DOE), which has now directed schools to block the application on all platforms in favour of more trusted videoconferencing and collaboration tools. As a reminder to the uninitiated reader, custom cryptography (or "roll your own crypto", as Citizen Lab playfully calls it) presents a surveillance risk and the real possibility of all data being accessible through a built-in backdoor. It is difficult, if not impossible, to prove a negative, so questions about the company's ability to access data at rest or in transit will likely persist for the foreseeable future.
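Among Citizen Lab's findings was Zoom's use of AES-128 in ECB mode, a textbook illustration of why non-standard crypto choices leak information: in ECB, identical plaintext blocks always produce identical ciphertext blocks, so an eavesdropper can see structure and repetition without ever breaking the cipher. A minimal sketch of that leakage, using a toy stand-in "block cipher" (a simple XOR, purely illustrative, not AES):

```python
# Toy demonstration of ECB-mode pattern leakage: identical plaintext
# blocks encrypt to identical ciphertext blocks. The "cipher" here is
# a stand-in (XOR with a fixed key); real AES in ECB mode exhibits
# exactly the same block-level leakage.

BLOCK = 16
KEY = bytes(range(BLOCK))  # illustrative fixed key

def toy_encrypt_block(block: bytes) -> bytes:
    return bytes(b ^ k for b, k in zip(block, KEY))

def ecb_encrypt(plaintext: bytes) -> bytes:
    assert len(plaintext) % BLOCK == 0
    return b"".join(
        toy_encrypt_block(plaintext[i:i + BLOCK])
        for i in range(0, len(plaintext), BLOCK)
    )

# Two identical 16-byte "frames" of data...
frame = b"SENSITIVE FRAME " * 2
ct = ecb_encrypt(frame)

# ...yield two identical ciphertext blocks, revealing repetition.
assert ct[:BLOCK] == ct[BLOCK:2 * BLOCK]
```

Standard constructions (e.g. AES-GCM with unique nonces) avoid exactly this failure, which is why rolling your own crypto is so widely discouraged.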
For all intents and purposes, Zoom owns all the data it collects
Europe's General Data Protection Regulation (GDPR), Canada's Personal Information Protection and Electronic Documents Act (PIPEDA) and virtually every other piece of modern privacy legislation are built around the simple notion that information about individuals belongs to those individuals.
In contrast, China’s understanding of privacy is vastly different: the data belongs to the organizations that collect it and any such organizations must grant unfettered access for government inspection, in the name of safety and security. Article 77 of its Cybersecurity Law ensures that data is collected and stored in China where full transparency and access must be provided to the Ministry of Public Security. Period. It is unclear just how much Zoom data is stored or archived in China, but if it can be inspected, decrypted or accessed by/for Chinese authorities as Citizen Lab’s research indicates, chances are that storage would also be taking place in that country.
In response to Citizen Lab's report, Zoom published a short blog post on April 3rd, 2020 entitled "Response to Research From University of Toronto's Citizen Lab". In his 762-word response, CEO Eric Yuan disputed none of the findings but apologized for the oversight, indicating that call routing had been corrected. Although neither privacy nor surveillance was mentioned, Yuan acknowledged "that we can do better with our encryption design".
Zoom’s collection and disposal of student data is inadequate for educational use
And disposal? What disposal?
In Canada, the official policy position of the federal Privacy Commissioner (OPC) is that given the practical obstacles to obtaining meaningful consent from children, especially implied consent, organizations should avoid knowingly tracking children and tracking on websites aimed at children.
In all but exceptional cases, consent for the collection, use and disclosure of personal information of children under the age of 13, must be obtained from their parents or guardians. — OPC
The company makes it clear that it collects enormous amounts of data and keeps it in unspecified locations for unspecified amounts of time, and there is nothing that parents or students can do about it. Privacy policies and terms of service are opportunities for companies to clarify, not obscure, their information handling practices. It is precisely this uncertainty around the confidentiality, storage, disposal of and access to the terabytes of personally identifiable information transferred daily that makes Zoom an unacceptable option for use in public education. NPR has reported on these concerns as well.
It would be irresponsible for any Canadian school board, U.S. school district or school anywhere to recommend or support the use of the platform under current circumstances, and that is before we even touch on its security problems, which, according to an April 1st blog post by Zoom's CEO, will take months to repair.
Think carefully about the importance of demonstrating to children the correct way to adopt new technology, especially popular innovations that have been rapidly adopted without adequate oversight. Such edtech is increasingly trusted to protect student data, but it can easily compromise an entire generation due to lax controls over data sharing and disposal.
Parents and teachers must assume the role of responsible gatekeepers even if it triggers the unpleasant feeling that they’re up against a popular trend. Education authorities need to rapidly ramp up their security testing and audit capabilities to urgently conduct due diligence on the avalanche of education technologies that have flooded the market over the past decade. Zoom is a perfect example of those challenges. It’s slick, useful and disastrously risky for any exchange or creation of sensitive data.
In a March 30, 2020 press release, the Federal Bureau of Investigation warned of videoconferences being disrupted by pornographic and/or hate images and threatening language. In addition to recommending due diligence and caution, the FBI suggests taking the following steps to mitigate teleconference hijacking threats when using Zoom. Although the company said that it has taken steps to remediate some or all of the reported security problems, these are good practices for meeting organizers:
- Do not make meetings or classrooms public (require a password instead)
- Do not share a link to a teleconference or classroom openly on social media (provide individual links to specific attendees instead)
- Restrict screen sharing to "Host Only"
- Ensure attendees are running the latest version of the software
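The checklist above lends itself to being verified programmatically rather than remembered ad hoc. A minimal sketch of a pre-meeting audit follows; the settings dictionary and its field names are hypothetical stand-ins, not Zoom's actual API:

```python
# Hedged sketch: auditing a meeting configuration against the FBI's
# recommendations. The `settings` keys below are hypothetical
# illustrations, not real Zoom API fields.

REQUIRED = {
    "password_required": True,       # never public: require a password
    "link_shared_publicly": False,   # individual invites only
    "screen_share": "host_only",     # lock screen sharing to the host
    "client_up_to_date": True,       # patched software only
}

def audit_meeting(settings: dict) -> list:
    """Return a list of violations against the checklist."""
    violations = []
    for key, required_value in REQUIRED.items():
        actual = settings.get(key)
        if actual != required_value:
            violations.append(
                f"{key}: expected {required_value!r}, got {actual!r}"
            )
    return violations

# Example: a classroom meeting configured carelessly
risky = {"password_required": False, "link_shared_publicly": True,
         "screen_share": "all", "client_up_to_date": True}
print(audit_meeting(risky))  # flags three violations
```

A compliant configuration returns an empty list; anything else blocks the meeting until the host fixes it. The value of the exercise is the habit, not the particular field names.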
Zoom's disarming convenience and unquestionable utility during the global COVID-19 crisis create a cognitive dissonance that makes complaining feel ungrateful. But we need to recognize a simple reality: a business model designed to live on data is incompatible with the interests of vulnerable populations. To be clear, no one is saying the company is especially predatory in its processing of children's information or student data, but in my opinion its current approach to privacy protection should immediately disqualify Zoom from being used in public education or any context involving children.
The company employs lawyers to handle all matters related to privacy, but the proactive protection of children's data is not, at its core, a matter of law or compliance. School boards and public education bodies must work with qualified professionals experienced in protecting the interests of children and families. Privacy is also a matter of ethics. In countries that supposedly benefit from modern privacy legislation, the rights of data subjects should be ardently protected and enforced; otherwise an entire generation of young people risks reaching the age of majority only to find those rights eroded into irrelevance by the apathy, incompetence and insouciance of their elders.
There is a vast difference between happiness and trust, rhetoric and propaganda. In this week’s blog post, Zoom’s CEO wrote “We know we have a long way to go to earn back your full trust, but we are committed to throwing ourselves into bolstering our platform’s security”.
While any mention of privacy is conspicuous by its absence, its importance cannot be overstated. The yin-yang duality between these two inseparable notions is a reminder that just as edtech companies acknowledge their responsibility to get security right, it is even more important that they demonstrate an absolute commitment to the protection of privacy, precisely because the interests and objectives of governments, commercial entities and human beings diverge.
In that sense, Zoom’s situation in itself is a great opportunity for education.
We shouldn’t miss it.
In my humble opinion.