Online Safety Act 2023: Children Codes published by Ofcom

06 May 2025. Published by Rupert Cowper-Coles, Partner, and Mafruhdha Miah, Senior Associate

On 24 April 2025, Ofcom published the Protection of Children Codes and Guidance (the Codes), as part of the second phase of its three-phase process to implement the Online Safety Act 2023 (the Act) (see the full roadmap here). In-scope service providers are now required to complete their first children's risk assessments by 24 July 2025, and subject to parliamentary approval of the Codes, those service providers will need to comply with the Codes from 25 July 2025.

Which services are in scope of the Codes?

All providers of "user-to-user services" and/or "search services" as defined by s3 of the Act were required to complete a Children's Access Assessment by 16 April 2025. The Access Assessment comprised a two-stage test to determine whether a service, or part of a service, is likely to be accessed by children. If the Access Assessment determines that children are likely to access the service or part of it, the relevant service provider is within the scope of the Codes and must comply with them.

Risk assessments

Service providers within scope must conduct risk assessments to identify the kinds of content harmful to children present on their services, assessing each kind of harmful content separately to identify the risk factors relevant to the service. Based on this information, service providers should then determine how likely it is that children will or may encounter such harmful content, and conclude whether their service is at negligible, low, medium or high risk for each kind of content.

Once service providers have identified their risk level, they must consult the Codes and consider the recommended measures to be taken to mitigate and manage those risks for child users.

What measures do the Codes require in-scope service providers to implement?

The Codes take a proportionate approach when recommending measures to be implemented, acknowledging that not all services pose the same level of risk. Different measures in the Codes apply to different types of services, taking into consideration the type of service provided, the relevant functionalities and characteristics of the service, the number of users the service has and the outcome of the service's latest children's risk assessment. Additionally, some measures involve using age assurance to ensure safety measures can be implemented without prejudicing adults' right to access legal content in the UK.

Over 40 safety measures have been proposed in the Codes. For both user-to-user and search service providers, measures may relate to governance and accountability for the management of risks to children, effective reporting and complaints mechanisms for users, and settings and functionality giving users more control over the content they see. However, service providers are not compelled to take the recommended measures set out in the Codes. They may instead take alternative measures, which must be sufficient to mitigate and manage risks of harm to children and should be appropriately recorded, together with a justification of how those measures fulfil the relevant duties under the Codes.

Some of the most significant measures under the Codes relate to service providers' recommender and content moderation systems, designed to tackle certain categories of content pursuant to ss61 and 62 of the Act. The most harmful category is labelled "primary priority content" (PPC), which includes suicide, self-harm, eating disorder content and pornography; the next is labelled "priority content" (PC), which includes abusive content, hateful content, bullying, violent content, harmful substances, and dangerous stunts and challenges. Additionally, the Codes recognise a third category, "non-designated content" (NDC), covering otherwise uncaptured content which presents a material risk of significant harm to an appreciable number of children. Ofcom has indicated that this latter category includes body image and depressive content.

Service providers whose terms of service do not prohibit one or more kinds of PPC should apply content or access controls to ensure that children are "prevented" from encountering PPC, using highly effective age assurance measures to target the content and ensure that it can only be seen by adults. For PC or NDC, service providers are not required to use age assurance mechanisms to exclude this content from children but instead should take swift action to "protect" children from encountering this content, such as giving it lower priority, obscuring, blurring or distorting it, applying overlays or interstitials, or excluding it from content recommender feeds altogether.

What next?

At a Digital Regulation Group meeting earlier this week, Ofcom noted that the Government is keen to bring the Codes into force, with a view to making any necessary amendments at a later stage, and so Ofcom expects the Codes to receive Parliamentary approval shortly. In-scope service providers should therefore proceed on the basis that the Codes will be approved and in force by 25 July 2025, and should begin considering the risk assessments that need to be undertaken.

Helpfully, Ofcom has published a number of documents to assist service providers with undertaking the risk assessments, including this guidance, and has suggested that its digital toolkit (which assisted providers with completing their risk assessments for Illegal Harms) will be updated to address the Codes.

If you have specific questions on the OSA, please contact Rupert Cowper-Coles or Mafruhdha Miah.
