Tech firms must start protecting UK users from illegal content

LONDON (Reuters) – Tech companies must start putting in place measures to protect users in Britain from child sexual abuse images and other illegal content from Monday, as enforcement of the country's online safety regime ramps up.

Media regulator Ofcom said Meta’s Facebook, ByteDance’s TikTok, Alphabet’s YouTube and other companies must now implement measures such as better moderation, easier reporting and built-in safety tests to tackle criminal activity and make their platforms safer by design.

“Platforms must now act quickly to come into compliance with their legal duties, and our codes are designed to help them do that,” Ofcom’s enforcement director Suzanne Cater said.

The Online Safety Act, which became law in 2023, sets tougher standards for platforms, with an emphasis on child protection and the removal of illegal content.

In December, Ofcom published its first codes of practice for the new law and set companies a deadline of March 16 to assess the risks illegal content posed to users on their platforms.

The regulator will be able to issue fines of up to 18 million pounds ($23.31 million) or 10% of a company's annual global turnover if it fails to comply with the law.

Ofcom said file-sharing and file-storage services were particularly vulnerable to being used for sharing child sexual abuse material.

It launched a separate enforcement programme on Monday to assess the safety measures these services had in place to prevent the spread of such content.

The media regulator said it had asked a number of firms offering file-storage services to share their risk assessments by March 31. Those that fail to comply could face the same penalties.

($1 = 0.7721 pounds)

Reporting by Sam Tabahriti, Editing by Paul Sandle and Louise Heavens