Julian Hayes discusses ICO’s Children’s Code – Open Access Government

BCL partner Julian Hayes’s article discussing the ICO’s new guidance on safeguards for the online treatment of children’s personal data has been published by Open Access Government.

Here’s an extract from the article:

“Neither blessed with a catchy title nor immediately in force, the Age Appropriate Design Code grabbed few headlines when it was issued by the Information Commissioner’s Office (ICO) in 2020. Now re-badged as the Children’s Code and in force from 2 September 2021, it is being feted as an early blow in the UK government’s wider campaign against online harms and in particular the risks to the privacy of minors. Broadly drawn, both in terms of the online service providers affected and geographic reach, the Code provides guidance on safeguards for the online treatment of children’s personal data, with compliance underpinned by the potentially severe enforcement powers of the UK GDPR. Sensing the way the wind was blowing, the tech titans had already modified their services, spurring calls for similar measures in other countries. Misgivings over the ambit and practical impact of the Code remain, however, particularly in relation to the thorny issue of age-verification.

Long reach of the Code

Built around 15 high-level standards, the Code specifies the requirements which providers of ‘information society services’ (ISS) must meet if their products are likely – that is, are more likely than not – to be accessed by under 18 year-olds in the UK. The requisite standards, for the most part relatively uncontroversial, include prioritising the best interests of the child when designing and developing online services, establishing the age of individual users with a level of certainty appropriate to the risk, and upholding published terms, policies and community standards.

ISS providers include the majority of online services used by children, from social media platforms, search engines and online marketplaces through to content streaming services, messaging apps and online games. No mere parochial affair, the Code’s geographical sweep takes in not only UK entities but also those elsewhere which offer their services to UK users or monitor their behaviour.

Legal effect & sanctions

Although a product of the Data Protection Act 2018 (DPA), and notwithstanding that courts must have regard to its provisions, the Code does not itself have force of law. Instead, it spells out the measures which ISS providers must meet if they are to fulfil their obligations under key aspects of both the UK GDPR and the less well-known Privacy and Electronic Communications Regulations (PECR), which govern online marketing and brought us cookie banners.

The ICO has issued dire warnings that failure to conform to the Code may invite regulatory audit and will make it more difficult for companies to demonstrate compliance with the UK GDPR and PECR, with the ultimate sanction of heavy – and headline-grabbing – financial penalties for non-compliance. Given the passion which the online safety of children arouses, those suspected of transgressing can expect little indulgence from aggrieved complainants highly motivated to bring their suspicions to the data watchdog’s attention.

Ripple effect

In the months before the Code’s implementation, tech commentators discerned a slew of modifications by social media companies aimed at improving the privacy as well as the physical and emotional well-being of young people on their platforms. Instagram disabled targeted advertisements for under 18s and restricted the ability of adults to message children; YouTube videos uploaded by under 18s automatically defaulted to private and ‘bedtime’ reminders were introduced by the streaming platform; whilst TikTok decided to end push notifications for kids after a 10pm ‘watershed’ and also placed curbs on direct messaging for under 18s.

Such measures have been welcomed by UK child safety campaigners and, in the US, have led to cross-party Congressional calls for tech giants to commit to the same standards for young US users of their platforms. Heeding the call, some of the biggest social media companies have applied the changes globally.

Closer to home, the Irish Data Protection Commissioner is consulting on its own ‘Fundamentals for a Child-Oriented Approach to Data Processing’, whilst the French data supervisor, CNIL, has published eight recommendations to enhance the protection of children online. Against this backdrop, the ICO’s Code is in the vanguard of a global trend towards tackling some of the internet’s more harmful effects.”

This article was published by Open Access Government on 08/09/2021. You can read the full version on their website.

Julian Hayes advises companies and individuals in the rapidly developing field of data protection, especially in the context of data breaches and law enforcement investigations, where necessary litigating to ensure that the actions of state authorities are properly constrained. A partner at BCL for four years, he has vast experience of all types of criminal inquiries, including the unlawful obtaining of data and computer misuse offences. He is a well-known and highly regarded commentator on cybersecurity and privacy issues. He advises telecommunications operators on their obligations under UK investigatory powers legislation and provides practical guidance on how to handle demands placed upon them, including in establishing systems that work to ensure legal compliance and protection for the operator. He has also advised in relation to the US-UK Bilateral Data Sharing Agreement and the forthcoming UK online harms legislation.
