Tech firms to be forced to combat 'tsunami of online child abuse' by Online Safety Bill amendment

New legislation will give regulators the power to force technology companies to stop sexual abuse of children on their platforms.

The amendment to the Online Safety Bill, which was announced today by the Home Office, will allow Ofcom to demand that big tech firms such as Facebook and Google use their "best endeavours" to prevent, identify and remove child sexual abuse.

The move was welcomed by the National Society for the Prevention of Cruelty to Children (NSPCC), which said it would help stem what it called a "tsunami of online child abuse".

The amendment is a small but significant strengthening of the powers of Ofcom, which will become the regulator for tech and social media if the proposed Online Safety Bill becomes law.

It will let Ofcom insist on proof that child sexual abuse is being tackled, even if the technology behind the platform changes.

Meta, which owns Facebook, WhatsApp and Instagram, has announced plans to effectively lock Facebook Messenger and Instagram direct messages using end-to-end encryption, a technology which keeps conversations secure, but can also make them inaccessible for anyone trying to keep them safe.

Pros and cons of encryption

Home Secretary Priti Patel condemned Meta's encryption plans in the strongest possible terms, calling them "morally wrong and dangerous", and law enforcement agencies such as Interpol and the UK's National Crime Agency (NCA) have criticised the technology.

But Whitehall officials insist that they are not against encryption itself, just the problems it poses for law enforcement agencies and police forces, which need direct evidence of involvement with child sexual abuse to start investigations and make arrests.

Last year, the Internet Watch Foundation successfully blocked 8.8 million attempts by UK internet users to access videos and images of children being abused.

Faced with exploitation on this scale, officials argue that they must at the very least maintain their current level of access, which relies on the tech companies reporting instances of abuse to the authorities.

The investigation into David Wilson, for instance, who posed as girls online to elicit sexually explicit images from young boys, began with a report from Meta. Wilson was jailed for 25 years in 2021 after admitting 96 offences.

The new law will give Ofcom the power to insist that tech companies both inside and outside the UK identify and take down child sexual abuse content, potentially giving the UK regulator the authority to break encryption globally.

However, officials argue that this does not mean apps and other services cannot be encrypted, saying that technologies exist that can give police forces access to the material they need without compromising privacy.

The new law will require tech companies to take action on child sexual abuse "where it is proportionate and necessary to do so", giving Ofcom the ability to balance security for users and security for children.

Yet while this move may sound like a peace settlement on the vexed issue of encryption, it might not spell the end of conflict.

'Tsunami of online child abuse'

Attempts by Apple to scan iPhone images for known child sexual abuse imagery were delayed last year after an outcry by privacy campaigners.

The system, called NeuralHash, was designed to identify images in a privacy-protecting way by doing the analysis locally on the phone rather than in Apple's data centres, but privacy campaigners argued that the software could be abused by governments or authoritarian states.

Whitehall officials say the fears are overblown, pointing to the results of the Safety Tech Challenge Fund, a government-funded collaboration with industry to produce technology that can "keep children safe in end-to-end encrypted environments" - such as an algorithm that turns the camera off automatically when it detects the filming of nudity.

The announcement of the change to the legislation comes as police data obtained by the NSPCC showed what the charity described as a "tsunami of online child abuse".

Freedom of Information requests filed by the charity revealed that Sexual Communication with a Child offences had jumped by 80% in four years, rising to 6,156 in the last year on record - an average of almost 120 offences a week.

Sir Peter Wanless, the chief executive of the NSPCC, welcomed the change to the Online Safety Bill, saying it would strengthen the protections around private messaging.

"This positive step shows there doesn't have to be a trade-off between privacy and detecting and disrupting child abuse material and grooming," he told Sky News.
