Social media groups, search engines and messaging apps will next week be told to bring in strict measures to remove illegal material quickly and reduce the risk of users encountering it, under new rules from the UK media watchdog.
Ofcom will begin enforcing the new rules designed to protect internet users from illegal content and harmful activity online. Regulators and lawmakers want additional powers to curb the rise of the kind of extreme and false information that provoked violent unrest last summer following the mass stabbing in Southport.
Under the UK’s Online Safety Act (OSA), tech companies needed to complete compulsory illegal content risk assessments by this weekend to understand how likely it was for users to encounter illegal content on their service.
In the case of “user-to-user” messaging services, this includes how they could be used to commit or facilitate criminal offences.
So-called priority illegal content covers 17 categories, from terrorism, child sexual abuse and encouraging or assisting suicide, to stalking, drug and fraud offences.
Ofcom will from next week start assessing platforms’ compliance with the new illegal harms obligations under the OSA, and begin enforcement action where there are issues and failures to comply. The OSA was passed by parliament in 2023, but is being implemented in phases this year and next.
Sites and apps will need to start implementing safety measures to mitigate risks, including naming a senior executive accountable for compliance, better moderation, easier reporting and built-in safety tests.
Under the new rules, tech companies will have to ensure their moderation teams are resourced and trained, and set performance targets to remove illegal material quickly when they become aware of it. Platforms will need to test algorithms to make illegal content harder to disseminate.
Ofcom will initially prioritise larger sites and apps that may present particular risks of harm from illegal content owing to their size or nature, for example because they have a large number of users in the UK, or because their users may risk encountering some of the most harmful forms of online content and conduct.
Suzanne Cater, enforcement director at Ofcom, said: “Platforms must now act quickly to come into compliance with their legal duties, and our codes are designed to help them do that. But, make no mistake, any provider who fails to introduce the necessary protections can expect to face the full force of our enforcement action.”
Linklaters, the British law firm, described the new rules as “the first big regulatory deadline” under the OSA. Companies can be fined up to £18mn or 10 per cent of their qualifying worldwide revenue, whichever is greater.
Ben Packer, partner at Linklaters, said: “We’re going to find out quite quickly who’s engaged with this properly and done a thorough job of the risk assessments. I suspect there may well be some companies in scope who haven’t done much at all.”
Packer added that the threat of Ofcom intervening in companies’ operations may outweigh the financial penalties, noting that the regulator could order companies “to take additional measures when it comes to content moderation or user reporting or the technology deployed to detect content”.
“Our experience in other sectors is that it’s those kinds of interventions in company operations that tend to be the more impactful,” he said.