Communications regulator Ofcom has laid out practical steps for tech firms to ensure children are better protected online under the Online Safety Act (OSA).
Ofcom announced in April that it was preparing to launch a consultation on its draft children’s safety codes of practice under the OSA. This consultation opened today, setting out measures that, according to Ofcom, will deliver a step-change in online safety for children in the UK.
“Once these measures are in force we won’t hesitate to use our full range of enforcement powers to hold tech firms to account. That’s a promise we make to children and parents today,” said Dame Melanie Dawes, Ofcom chief executive.
The consultation follows the results of Ofcom’s annual study published in April, which revealed the extent to which young children are navigating social media apps and online sites, often without any adult supervision.
Under these measures, it will be the responsibility of tech firms to ensure that their platforms’ fundamental design and operating choices protect children from exposure to harmful content, including content relating to suicide, self-harm and eating disorders, as well as pornography.
They must also minimise children’s exposure to other serious harms, including violent, hateful or abusive material, bullying content, and content promoting dangerous challenges.
Ofcom’s consultation proposes more than 40 safety measures that tech firms will need to take to ensure their platforms are child-safe. These include robust age checks, algorithms that limit the risks to younger users, default safety settings and content moderation systems that quickly act on harmful content.
“In line with new online safety laws, our proposed codes firmly place the responsibility for keeping children safer on tech firms,” said Dawes.
“Tech firms will need to tame aggressive algorithms that push harmful content to children in their personalised feeds and introduce age checks so children get an experience that’s right for their age.”
Michelle Donelan, technology secretary, said: “I want to assure parents that protecting children is our number one priority and these laws will help keep their families safe. To platforms, my message is engage with us and prepare. Do not wait for enforcement and hefty fines – step up to meet your responsibilities and act now.”
This is the second major consultation that Ofcom, as regulator of the OSA, has published as part of its work to establish the new regulations.
The OSA has been described as a “landmark” law aimed at preventing the spread of child sexual abuse material, terrorism content and fraud. According to Ofcom, its first priority will be protecting children.
The consultation period for this set of proposals to protect children from harms online will close on 17 July 2024. Thereafter, Ofcom will finalise the proposals with the aim of publishing its final codes of practice within a year.
Ofcom will also launch an additional consultation later this year on how automated tools, including artificial intelligence, can be used to proactively detect illegal content and content most harmful to children – including previously undetected child sexual abuse material and content encouraging suicide and self-harm.