NSPCC lays out six tests for Government to create world-leading laws to protect children online

  • Charity at the forefront of the Online Harms Bill urges Government to deliver on Boris Johnson’s determination for ambitious regulation
  • NSPCC sets 11th hour demand for Government: ‘Pass our tests for online regulation so children don’t continue to suffer avoidable harm and abuse’
  • Online sex crimes recorded against children in Sussex surpass two a day, as Ian Russell backs calls for the Bill to also tackle suicide and self-harm posts

The NSPCC has laid out six tests the Government’s regulation of social media will be judged on if it is to achieve bold and lasting protections for children online.

The charity’s How the Wild West Web should be won report, released today, sets out how the upcoming Online Harms Bill must set the global standard in protecting children on the web.

With crucial decisions just days away, the charity is urging the Government to level the playing field for children and ensure new laws finally force tech firms to tackle the avoidable harm caused by their sites.

The call comes as new analysis of the latest ONS data shows the number of online sex crimes against children recorded by Sussex Police reached the equivalent of more than two a day between January and March this year, highlighting the sheer scale of web abuse.

Across England and Wales, that figure stood at 101. The NSPCC expects this to have increased during lockdown, with coronavirus heightening online harms to children that are driven by a historic failure to make platforms safe or to put even the most basic child protections in place.

At the Hidden Harms summit earlier this year, the Prime Minister signalled his personal determination to legislate for ambitious regulation that successfully combats child abuse.

But the NSPCC is worried the landmark opportunity to change the landscape for children online could be missed if this isn’t translated by Government into law.

They have released their six tests ahead of a full consultation response to the White Paper, amid concerns Ministers are wavering in their ambitions for robust regulation.

The charity believes, if done correctly, regulation could set a British model that leads the world in child protection online.

But in a stark warning, NSPCC CEO Peter Wanless, said that “failing to pass any of the six tests will mean that rather than tech companies paying the cost of their inaction, future generations of children will pay with serious harm and sexual abuse that could have been stopped”.

The pandemic is likely to result in long-term changes to the online child abuse threat, with high-risk livestreaming and video chat becoming more popular. Changes to working patterns, meaning more offenders working at home, could result in a greater demand for sexual abuse images and increased opportunities for grooming.

Mr Wanless added: “Industry inaction is fuelling this staggering number of sex crimes against children and the fallout from coronavirus has heightened the risks of abuse now and in the future.

“The Online Harms Bill must become a Government priority, with unwavering determination to take the opportunity to finally end the avoidable, serious harm children face online because of unaccountable tech firms.”

The six tests are backed by Ian Russell, who has campaigned for regulation since the death of his daughter, Molly, by suicide, after she was targeted with self-harm posts on social media.

The six tests the Government must pass if it is to create game-changing and lasting protections for children online are:

  • An expansive, principles-based duty of care; tech firms should have a legal responsibility to identify harms caused by their sites and deal with them, or face tough consequences for breaching regulation.
  • Tackling online sexual abuse; platforms must proactively and consistently tackle grooming and abuse images facilitated by dangerous design features. There must be no excuses. In the current state of play, abuse images have been left online with the excuse that a child’s age cannot be proven, and images signposting abuse are not removed.
  • Tackling legal but harmful content; current Government proposals will see companies set their own rules on legal but harmful content. This is not good enough. The law must compel firms to respond to the harms caused by algorithms targeting damaging suicide and self-harm posts at children, and avoid a two-tier system that prioritises tackling illegal content. The danger of harmful content should rightly be balanced against freedom of expression, but the focus must remain on the risk to children.
  • Transparency and investigation powers; tech firms currently only dish out information they want the public to see. The regulator must have the power to lift up the bonnet to investigate platforms and demand information from companies.
  • Criminal and financial sanctions; fines are vital but will be water off a duck’s back to some of the world’s wealthiest companies. Government can’t backslide on a named manager scheme that gives the regulator powers to prosecute rogue tech directors in UK law.
  • User advocacy arrangements; to level the playing field there must be a strong civil society voice for children against well-resourced industry pressure. Big tech should be made to clean up the damage they have caused by funding user advocacy arrangements.

The NSPCC has been the leading voice for social media regulation and the charity set out detailed proposals for an Online Harms Bill last year, which informed much of the White Paper.

The Government has said the consultation response will be published in the autumn, with legislation expected to be delivered in the new year.