Pleasing almost nobody, the U.K.'s media regulator has warned that social media companies must stop their algorithms from recommending harmful content to children, and must put 'robust' age checks in place.

Ofcom has issued a set of more than 40 proposed measures designed to protect children under the Online Safety Act.

“In line with new online safety laws, our proposed codes firmly place the responsibility for keeping children safer on tech firms,” said Ofcom chief executive Melanie Dawes.

“Our measures—which go way beyond current industry standards—will deliver a step-change in online safety for children in the UK. Once they are in force we won’t hesitate to use our full range of enforcement powers to hold platforms to account.”

The draft code requires much greater use of highly effective age assurance. In some cases, says Ofcom, this will mean preventing children from accessing an entire site or app; in others it might mean restricting parts of a service to adults only, or removing children's access to identified harmful content.
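Ofcom describes those outcomes rather than an implementation. Purely by way of illustration, a minimal Python sketch of the three gating options the draft code mentions might look like the following, where every name (AccessPolicy, verified_adult and so on) is hypothetical:

```python
# Illustrative sketch only: the draft code does not prescribe an implementation.
from dataclasses import dataclass
from enum import Enum


class AccessPolicy(Enum):
    BLOCK_SERVICE = "block_service"          # whole site/app is adults-only
    RESTRICT_SECTIONS = "restrict_sections"  # some areas are adults-only
    FILTER_CONTENT = "filter_content"        # children allowed, harmful items removed downstream


@dataclass
class User:
    verified_adult: bool  # outcome of a highly effective age-assurance check


def may_proceed(user: User, policy: AccessPolicy, section_is_adult: bool) -> bool:
    """Return True if the user may access the requested section under the policy."""
    if user.verified_adult:
        return True
    if policy is AccessPolicy.BLOCK_SERVICE:
        return False
    if policy is AccessPolicy.RESTRICT_SECTIONS:
        return not section_is_adult
    # FILTER_CONTENT: access is allowed; harmful items are removed elsewhere
    return True
```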

There are also new rules for recommender systems, the algorithms that serve personalized recommendations to users. Services that use them will have to configure those algorithms to filter the most harmful content out of children's feeds, and to make other harmful content less visible and prominent.
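Here too the draft code specifies the outcome, not the mechanism. A hypothetical sketch of that two-tier treatment might look like this; the harm_tier labels and the down-ranking penalty are assumptions for illustration, not anything Ofcom specifies:

```python
# Two-tier treatment for a child's feed: exclude the most harmful content
# entirely, and down-rank other harmful content so it is less prominent.
from dataclasses import dataclass


@dataclass
class Item:
    item_id: str
    relevance: float  # recommender's relevance score
    harm_tier: int    # hypothetical labels: 0 = none, 1 = harmful, 2 = most harmful


DOWNRANK_PENALTY = 0.5  # assumed weighting; not specified by the draft code


def rank_for_child(candidates: list[Item]) -> list[Item]:
    feed = [i for i in candidates if i.harm_tier < 2]  # filter out the most harmful

    def score(i: Item) -> float:
        return i.relevance * (DOWNRANK_PENALTY if i.harm_tier == 1 else 1.0)

    return sorted(feed, key=score, reverse=True)


# Example: "a" (most harmful) is excluded; "c" is down-ranked below "b"
# despite its higher raw relevance (0.85 * 0.5 = 0.425 < 0.8).
feed = rank_for_child([Item("a", 0.9, 2), Item("b", 0.8, 0), Item("c", 0.85, 1)])
```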

Social media firms must also improve their content moderation. While there is no detail on how they are to do this, they are told they must ensure swift action is taken against content that is harmful to children.

Where a user is believed to be a child, large search services must implement a 'safe search' setting that cannot be turned off and that filters out the most harmful content.
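In practice that amounts to a setting a child's own preferences can never override. A minimal sketch, assuming a hypothetical search service and a most_harmful flag set by some upstream classification step:

```python
# Hypothetical safe-search enforcement: a user believed to be a child
# gets safe search forced on, regardless of their stated preference.
def effective_safe_search(believed_child: bool, user_preference: bool) -> bool:
    if believed_child:
        return True           # forced on; cannot be turned off
    return user_preference    # adults keep their own choice


def search(results: list[dict], believed_child: bool,
           user_preference: bool = True) -> list[dict]:
    if effective_safe_search(believed_child, user_preference):
        # 'most_harmful' is an assumed flag attached by upstream classification
        return [r for r in results if not r.get("most_harmful")]
    return results
```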

“The government assigned Ofcom to deliver the Act and today the regulator has been clear: platforms must introduce the kinds of age-checks young people experience in the real world and address algorithms which too readily mean they come across harmful material online,” said technology secretary Michelle Donelan.

“To platforms, my message is engage with us and prepare. Do not wait for enforcement and hefty fines—step up to meet your responsibilities and act now.”

The code has been welcomed by the National Society for the Prevention of Cruelty to Children (NSPCC), which has long been campaigning for tighter restrictions on social media companies.

“Importantly, this draft code shows that both the Online Safety Act and effective regulation have pivotal roles to play in ensuring children can access and explore the online world safely,” said NSPCC CEO Peter Wanless.

“We look forward to engaging with Ofcom’s consultation and will share our safeguarding and child safety expertise to ensure that the voices and experiences of children and young people are central to decision-making and the final version of the code.”

However, others have criticized the draft code for not going far enough, with the parents of two children who died after taking part in online ‘challenges’ telling Sky News that they felt ‘belittled’ by a lack of consultation and the slow pace of change.

Meanwhile, digital rights campaigners aren’t happy either, with the Open Rights Group warning that the proposals threaten both freedom of speech and security.

“Adults will be faced with a choice: either limit their freedom of expression by not accessing content, or expose themselves to increased security risks that will arise from data breaches and phishing sites,” said executive director Jim Killock.

“Some overseas providers may block access to their platforms from the UK rather than comply with these stringent measures.”

And, he added: “We are also concerned that educational and help material, especially where it relates to sexuality, gender identity, drugs and other sensitive topics, may be denied to young people by moderation systems.”

Ofcom says its consultation is open until 17 July, with a final statement and documents to be published in spring next year.
