Tech firms told to hide 'toxic' content from children

Ofcom has warned social media sites they could be named and shamed - and banned for under-18s - if they fail to comply with new online safety rules.

The media regulator has published draft codes of practice which require tech firms to have more robust age-checking measures, and to reformulate their algorithms to steer children away from what it called "toxic" material.

But parents of children who died after exposure to harmful online content have described the proposed new rules as "insufficient" - one told the BBC change was happening "at a snail's pace."

In statements, Meta and Snapchat said they had extra protections for under-18s, and offered parental tools to control what children can see on their platforms.

Other firms have not responded to a BBC request for comment.

Ofcom boss Dame Melanie Dawes said any company that broke the draft codes of practice would be "named and shamed", and she made clear tougher action such as banning social media sites for children would also be considered.

Speaking to BBC Breakfast, Esther Ghey - whose daughter Brianna was murdered, aged 16, by two teenagers in February 2023 - said she believed Ofcom "really did care" about trying to get regulation right.

But she said the full extent of the problem remained unknown.

Lisa Kenevan, whose son Isaac died aged 13 after taking part in a "blackout" challenge online, said the pace of change was not fast enough.

"The sad thing is the snail's pace that is happening with Ofcom and social media platforms taking responsibility - the reality is there's going to be more cases," she told BBC Breakfast.

It is Ofcom's job to enforce new, stricter rules following the introduction of the Online Safety Act - these codes set out what tech firms must do to comply with that law.

Ofcom says they contain more than 40 "practical measures."

The centrepiece is the requirement around algorithms, which are used to decide what is shown in people's social media feeds.

Ofcom says tech firms will need to configure their algorithms to filter out the most harmful content from children’s feeds, and reduce the visibility and prominence of other harmful content.

Other proposed measures include forcing companies to perform more rigorous age checks if they show harmful content, and making them implement stronger content moderation, including a so-called "safe search" function on search engines that restricts inappropriate material.

Speaking to BBC Radio 4's Today programme, Dame Melanie described the new rules as "a big moment".

"Young people are fed harmful content on their feed again and again and this has become normalised but it needs to change," she said.

According to Ofcom's timeline, these new measures will come into force in the second half of 2025.

The regulator is seeking responses to its consultation on the draft codes until 17 July, after which it says it expects to publish final versions of them within a year.

Firms will then have three months to assess the risks of children encountering harmful content on their platforms, and to set out how they will mitigate those risks, taking Ofcom's guidance into account.

Dame Melanie added: "We will be publishing league tables so that the public know which companies are implementing the changes and which ones are not."

Ian Russell, Melanie Dawes and Esther Ghey spoke for nearly half an hour about Ofcom's new measures [BBC]

Dame Melanie met Ms Ghey and Ian Russell, whose daughter Molly took her own life in 2017 at the age of 14.

In 2022, a coroner concluded she died from an act of self-harm while suffering from depression and the negative effects of online content.

They are part of a group of bereaved parents who have signed an open letter to Prime Minister Rishi Sunak and leader of the opposition Sir Keir Starmer.

In it, they implore the politicians to do more for the online safety of children - including making "a commitment to strengthen the Online Safety Act in the first half of the next parliament."

They also ask for mental health and suicide prevention to be added to the school curriculum.

"While we will study Ofcom’s latest proposals carefully, we have so far been disappointed by their lack of ambition," they add in the letter.

'Step up'

The government insists the measures announced by Ofcom "will bring in a fundamental change in how children in the UK experience the online world."

The Technology Secretary, Michelle Donelan, urged big tech firms to take the codes seriously.

"To platforms, my message is engage with us and prepare," she said.

"Do not wait for enforcement and hefty fines – step up to meet your responsibilities and act now."

Bruce Daisley, the former UK boss at Twitter and YouTube, told BBC Radio 5 Live Breakfast the tech for checking the ages of under-18s would need to improve for the proposals to work.

"The challenge of course is identifying who young users are - so the impact for all of us is that age verification is going to step up a notch," he said.

Most of the tech companies contacted by the BBC did not reply or declined to comment on the record.

A Snapchat spokesperson said: "As a platform popular with young people, we know we have additional responsibilities to create a safe and positive experience.

"We support the aims of the Online Safety Act and work with experts to inform our approach to safety on Snapchat."

And a Meta spokesperson said the firm wanted young people "to connect with others in an environment where they feel safe".

"Content that incites violence, encourages suicide, self-injury or eating disorders breaks our rules and we remove that content when we find it," they said.
