Popular discussions on children’s data protection are often framed as a zero-sum game between those who want children to be protected in the digital age and those who fear a chilling effect on content creation. But this is a false choice: it is possible to do both. Proper safeguards – including embedding privacy-by-design principles and obtaining verifiable consent – allow organisations both to abide by the law (protecting children) and to facilitate children’s access to entertainment, education, and information through new and innovative content.

In light of new data privacy laws in the EU and the US that seek to protect children, this article explores pragmatic ways to balance these important interests.

US Regulatory Approach: Federal Trade Commission Action and Workshop

In September 2019, the Federal Trade Commission (FTC) announced that Google’s YouTube had agreed to pay a $170 million fine for violating the Children’s Online Privacy Protection Act (COPPA), a US federal law designed to protect children under the age of 13. The FTC found that YouTube had illegally harvested personal information from children and used it to target them with ads. As part of the settlement with the FTC and the New York Attorney General, YouTube must now ask video channel owners to specify whether their content is intended for children. Furthermore, YouTube will treat data from anyone watching child-directed videos as “coming from a child, regardless of the age of the user” and stop serving personalised ads to viewers of such videos.
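To make the mechanics concrete, the sketch below shows, in Python, the kind of rule the settlement describes: a creator-declared flag that switches off personalised advertising for everyone who watches the video. The names (`Video`, `made_for_kids`, `ad_policy_for`) are hypothetical illustrations, not YouTube’s actual API.

```python
from dataclasses import dataclass

@dataclass
class Video:
    video_id: str
    made_for_kids: bool  # creator-declared flag, as the settlement requires

def ad_policy_for(video: Video) -> dict:
    """Hypothetical ad-serving settings for a video.

    If the creator has flagged the video as child-directed, every viewer's
    data is treated as a child's regardless of the account's stated age,
    so personalised ads and behavioural tracking are switched off.
    """
    if video.made_for_kids:
        return {"personalised_ads": False, "behavioural_tracking": False}
    return {"personalised_ads": True, "behavioural_tracking": True}
```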

Shortly after the FTC fine and settlement (which sent shock waves throughout the high-tech industry), the FTC held a workshop on October 7, 2019 to explore whether and how COPPA should be revised. The discussions addressed how technologies affect children’s privacy, how education technology vendors should address parental consent, how technology companies should deal with audio files, and whether the FTC’s 2013 revisions to COPPA have worked as intended.

However, the workshop has been widely viewed as an attempt by high-tech industry advocates to weaken the COPPA Rule. Indeed, the Director of the FTC’s Bureau of Consumer Protection reiterated the industry’s main argument that increased privacy protection would harm consumers’ internet experience: without the targeted ads that enable creators to monetize their creativity, YouTube would be “a desert of crap.”

Before the workshop, a bipartisan group of US senators sent a letter to the FTC “strongly caution[ing] against undertaking a process that ultimately weakens children’s privacy instead of improving it.” The senators expressed concern that the “FTC is at risk of favoring the interests of giant tech companies over the interests of parents and children,” and advised the agency that if any changes come out of the workshop, children’s privacy and well-being must take priority.

The Wider Legal Landscape on Children’s Data Protection

These FTC developments come at a time when considerations of how to better protect children have been taking shape in other fora. The European Union’s General Data Protection Regulation (GDPR) and the new California Consumer Privacy Act (CCPA), which comes into effect on January 1, 2020, both emphasize special protection for children. Under the GDPR, where an online service is offered directly to a child and relies on consent, a parent or guardian must provide that consent before the personal data of a child under 16 years of age is collected or processed; member states have discretion to lower this threshold to as low as 13. However, when online preventive or counseling services are offered directly to a child, parental or guardian consent is not necessary.
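As a rough illustration, this consent gate could be sketched as follows. The function and parameter names are hypothetical, and the member-state threshold is a parameter precisely because it varies between 13 and 16 across the EU:

```python
GDPR_DEFAULT_AGE_OF_CONSENT = 16  # member states may lower this, but not below 13

def parental_consent_required(age: int,
                              member_state_threshold: int = GDPR_DEFAULT_AGE_OF_CONSENT,
                              preventive_or_counseling: bool = False) -> bool:
    """Hypothetical check: is parental/guardian consent needed before
    collecting or processing this child's personal data under the GDPR?"""
    if preventive_or_counseling:
        # Preventive or counseling services offered directly to a child
        # do not require parental or guardian consent.
        return False
    return age < member_state_threshold
```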

In California, a business that has “actual knowledge” that a consumer is under the age of 16 may not sell that consumer’s personal information unless the consumer opts in; a business that willfully disregards the consumer’s age is deemed to have actual knowledge of it. For children under the age of 13, the opt-in must take the form of affirmative authorisation by the child’s parent or guardian, while consumers aged 13 to 16 may opt in themselves.
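A hypothetical gate on “selling” personal information under these CCPA age bands might look like the sketch below (illustrative only; the names are invented, and this assumes the business already has actual knowledge of the consumer’s age):

```python
def may_sell_personal_info(age: int,
                           consumer_opted_in: bool = False,
                           parent_authorised: bool = False,
                           consumer_opted_out: bool = False) -> bool:
    """Hypothetical CCPA sale gate, assuming the business has actual
    knowledge of the consumer's age."""
    if age < 13:
        # Under 13: affirmative authorisation from a parent or guardian.
        return parent_authorised
    if age < 16:
        # 13 to 15: the consumer must opt in themselves.
        return consumer_opted_in
    # 16 and over: the default opt-out model applies.
    return not consumer_opted_out
```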

Under COPPA, “actual knowledge” is not defined, but the FTC has explained that a user’s age can be determined when an operator of a website asks for and receives information that allows the operator to determine the user’s age. For example, the operator may ask for the date of birth to register on a site or app, or use “age-identifying” questions, such as, “What grade are you in?” Moreover, as YouTube discovered, if a website or online service has content that is directed at children under the age of 13 and collects their information without parental consent, the provider may be in violation of COPPA.
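For instance, an operator that collects a date of birth at registration can derive the user’s age and decide whether COPPA’s parental consent requirement is triggered. The sketch below is a minimal illustration of that idea; the helper names are hypothetical:

```python
from datetime import date
from typing import Optional

def age_from_birthdate(birthdate: date, today: Optional[date] = None) -> int:
    """Compute age in whole years from a date of birth collected at sign-up."""
    today = today or date.today()
    years = today.year - birthdate.year
    # Subtract a year if this year's birthday has not yet occurred.
    if (today.month, today.day) < (birthdate.month, birthdate.day):
        years -= 1
    return years

def requires_verifiable_parental_consent(birthdate: date) -> bool:
    """True if the user is under 13, so personal data must not be
    collected until verifiable parental consent is obtained."""
    return age_from_birthdate(birthdate) < 13
```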

Like the GDPR and CCPA, COPPA requires entities that collect children’s information to make reasonable efforts to provide direct notice to parents and obtain verifiable consent prior to collecting, using, or disclosing personal data of children under the age of 13.

The biggest difference between the US and the EU is that the EU provides no exception for a controller that is unaware it is collecting a child’s personal data or offering services to children. In the US, by contrast, as the FTC’s case against Google’s YouTube shows, having child-related content on a website – whether placed there actively or passively by permitting third parties to upload content – can itself constitute “actual knowledge.”

Balancing Children’s Data Protection and Business Growth

While the US authorities are discussing substantive ways to change COPPA to balance children’s privacy and business growth, countries like the UK are working to improve current child privacy standards. In April 2019, the Information Commissioner’s Office (ICO) published for consultation a draft code that sets out 16 standards of age-appropriate design for online services. These standards are designed to strengthen children’s data protection while also allowing them freedom of expression, access to information, and the ability to engage in recreational activities appropriate to their age.

Similarly, the Irish Data Protection Commissioner published a preliminary report (Stream 1) in early September 2019, following a public consultation on children’s data protection rights under the GDPR. Issues highlighted in this consultation included the role of parents and guardians in the context of children’s personal data rights, the feasibility of checking a child’s age using alternative verification methods, and procedures to better operationalize data protection by design and by default.

The $170 million fine against YouTube, the $5.7 million fine against Musical.ly (now TikTok) for illegally collecting personal information from children, the $32.5 million settlement with Apple for allowing children to make in-app purchases without parental consent, and the roughly $20,700 fine against a school in Skellefteå, Sweden, for experimenting with facial recognition on high-school students all show that the issue is serious and gaining traction both in legislative circles and with the general public.

Protecting children’s privacy rights while balancing business interests does not have to be a zero-sum game. It is possible for businesses to introduce cost-efficient ways to obtain verifiable consent from parents while also growing in a fast-moving market. Indeed, the role of a Data Protection Officer (DPO), as articulated in the GDPR, is to help businesses operationalise data governance and compliance processes. Data protection experts and practitioners in the field can be instrumental in driving innovation while ensuring that businesses comply with the requirements of the jurisdictions where they operate.