The regulator’s guidance on protecting minors online comes at a time when some governments are considering prohibiting access to platforms altogether
Ofcom published its second major consultation on the implementation of the Online Safety Act, focused on child safety online
On 8 May 2024, Ofcom published its draft Children’s Safety Codes of Practice for public consultation. The proposed guidelines form the second phase of the regulator’s ongoing work to implement the Online Safety Act and represent the bulk of its planned effort to enforce the child safety provisions in the law. The Children’s Safety Codes follow on from Ofcom’s Codes of Practice for illegal harms, published in November 2023, and precede its code on obligations for categorised services, expected at the beginning of 2025. The consultation on the draft measures will run until 17 July 2024, and the regulator is aiming for the finalised codes on child safety to enter into force in Q3 2025.
Proposed guidance is targeted based on the size, risk profile and functions of platforms, as well as the nature of the harmful content
In targeting its different Codes of Practice to the many types of platforms captured by the Online Safety Act, Ofcom has tailored its guidance to services based on their size, risk profile and purposes or core functions. Platforms are broadly separated into user-to-user services, such as social media, and search services. Harmful content is likewise divided into illegal and legal types and assigned different levels of priority, which carry increasingly stringent obligations based on the risk it poses. In its initial Codes of Practice on illegal harms, Ofcom introduced a number of baseline measures relating to governance, accountability, user control, content moderation and recommendation systems, which are reiterated in the subsequent Children’s Safety Codes.
The Children’s Safety Codes of Practice introduce age assurance and recommendation system obligations specific to child protection
While many of the Codes’ proposed measures correspond to those already included in the illegal harms Codes of Practice, Ofcom has introduced 12 new measures specific to the protection of children online. These include:
Employing “highly effective” age assurance techniques when deploying child protection features;
Limiting the likelihood that harmful content is recommended to children and reducing its prominence in children’s feeds; and
Providing additional, age-appropriate support features to children at various points of engagement with the platform.
With the introduction of age assurance requirements, Ofcom wades into one of the most hotly contested fronts of the debate over user privacy versus child protection online. Though the regulator responds directly to critiques of the effectiveness of age assurance by requiring that methods be evaluated for technical accuracy, robustness, reliability and fairness, its guidance does little to address privacy concerns beyond restating existing data protection laws and asserting that children should not be denied “the benefits of being online and enjoying the opportunities that services present”.
The implementation of the Online Safety Act comes at a time of increased anxiety around children’s exposure to any form of social media
Ofcom’s child safety Codes arrive amid growing concern around the world about the harms children face from smartphone and social media use. On 14 May 2024, the House of Commons hosted a Westminster Hall debate on the topic, and a Government consultation on limiting or prohibiting smartphone access for children is expected soon. The UK is not alone in these concerns: a number of countries are engaged in similar public debates. In France, a report published in April 2024 by an expert panel commissioned by President Emmanuel Macron urged that smartphones, or any phones with internet connectivity, be banned for anyone under the age of 13. A 2023 UNESCO report likewise recommended a global ban on smartphones in schools, a proposal taken up by parents, educators and government officials in Spain, Ireland, the Netherlands and South Korea. Though many of the emerging legislative frameworks for online safety do include measures specifically designed to protect children, proponents of limiting or prohibiting minors’ access to smartphones appear to suggest that no level of regulatory intervention can address the full extent of the alleged harms to children from social media.