The UK data protection regulator, the Information Commissioner's Office ('ICO'), recently issued a notice of intent to TikTok Inc and TikTok Information Technologies UK Limited (together, 'TikTok') for alleged breaches of children's data protection.

TikTok is alleged to have:

  • Processed the data of children under the age of 13 without appropriate parental consent;
  • Failed to provide proper information to its users in a concise, transparent, and easily understood way; and
  • Processed special category data without legal grounds to do so.

The notice of intent sets out a proposed penalty of £27 million, which, if confirmed, would be imposed by way of a penalty notice. The ICO also stated that TikTok may have processed special category data, including data relating to ethnic and racial origin, political opinions, religious beliefs, sexual orientation, trade union membership, and genetic, biometric or health data, without legal grounds to do so.

This follows similar enforcement action taken against TikTok in 2019 by the US Federal Trade Commission ('FTC'). On that occasion, the FTC imposed a $5.7 million fine on TikTok for a breach of children's data. The Dutch Data Protection Authority has also fined TikTok €750,000 over its handling of children's data.

TikTok has 21 days to make written representations about the ICO penalty and its proposed amount. Verbal representations may also be made, at the discretion of the ICO. For significant penalties, a panel may be convened to consider the representations made by TikTok; however, the Information Commissioner has the final say. Once everything has been considered, the Commissioner will confirm the penalty.

Regulatory Stance

Article 8 of the General Data Protection Regulation ('GDPR') requires that, when digital services are offered to a child with consent as the legal basis of processing, parental consent must be obtained where the child is below the 'age of digital consent', which varies between jurisdictions. In the UK, the age of digital consent is 13; in Ireland, it is 16.

Age-Appropriate Design Code

The Age-Appropriate Design Code, also known as 'The Children's Code', is a data protection code of practice for online services which are likely to be accessed by children. The Children's Code stems from existing data protection laws and was introduced in the UK in September 2021. A similar statute, the Age-Appropriate Design Code Act, has recently been enacted in California. Many jurisdictions are likely to adopt this code of practice in the near future.

Lessons to be learned

Data dignity is for everyone, especially those who are more vulnerable and require additional protections, such as age-appropriate privacy notices.

John Edwards, the Information Commissioner, commented: "We all want children to be able to learn and experience the digital world, but with proper data privacy protections. Companies providing digital services have a legal duty to put those protections in place, but our provisional view is that TikTok fell short of meeting that requirement."

As a DPO service, we aim to prevent such breaches from occurring by advising our clients on current regulations. TikTok now has the opportunity to remedy these alleged breaches and bring its practices into line with evolving data protection law.

Key takeaways

  • Review your own business processes with respect to vulnerable individuals such as children or employees.
  • Consider if you are at risk of similar breaches.
  • Stay abreast of regulations in relevant jurisdictions and use them as your guide.
  • If your business is processing data belonging to children, create an age gate through which users verify that they are old enough to consent in line with local data protection laws.
  • Ensure that you always have an appropriate legal basis to process data.
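The age-gate takeaway above can be illustrated with a minimal sketch. The thresholds for the UK (13) and Ireland (16) come from the discussion of GDPR Article 8 earlier in this piece; the fallback of 16 reflects the GDPR default where a Member State has not set a lower age. The jurisdiction codes and function name are illustrative only, not a real API, and a production age gate would of course involve far more than a lookup (verification, record-keeping, and parental-consent workflows).

```python
# Illustrative age-gate check. Jurisdiction codes and names are
# assumptions for this sketch, not a real library or API.

DIGITAL_CONSENT_AGE = {
    "UK": 13,  # UK age of digital consent
    "IE": 16,  # Ireland's age of digital consent
}
DEFAULT_AGE = 16  # GDPR Article 8 default where no lower national age applies


def needs_parental_consent(age: int, jurisdiction: str) -> bool:
    """Return True if parental consent must be obtained before relying on
    consent as the legal basis for processing this user's data."""
    threshold = DIGITAL_CONSENT_AGE.get(jurisdiction, DEFAULT_AGE)
    return age < threshold


# Example: a 14-year-old can consent in the UK but not in Ireland.
print(needs_parental_consent(14, "UK"))  # False
print(needs_parental_consent(14, "IE"))  # True
```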

Contact HewardMills for further support on sensitive data processing.

If you would like to discuss this topic or anything else data protection and privacy-related, please contact us at