Tough new laws that would protect children from being bombarded with seriously harmful online material will be introduced as a top priority of a Labour government.
After meeting families who have lost children as a result of exposure to harmful content, the shadow culture secretary, Lucy Powell, has won the backing of the party leader, Keir Starmer, to legislate as one of the first acts of a Labour government if the party wins the next election.
The move by Labour comes weeks after the Conservative government ditched plans to in effect outlaw online material that is judged as “legal but harmful”, and dropped proposals to make platforms such as Facebook and Instagram liable for significant financial penalties for breaching regulations.
Labour said it would attempt to amend the online safety bill to something closer to its original form when it returns to parliament in just over two weeks’ time.
But if it failed, Powell said Labour would legislate as soon as possible to address problems with “legal but harmful” material, impose tough new criminal sanctions on those responsible for promoting damaging content, and create a new ombudsman to adjudicate.
In particular, Labour wants to prevent the use of algorithms and online business models that bombard young people with harmful material on issues such as suicide and self-harm. The death of Molly Russell in November 2017 highlighted the tragic consequences of such exposure.
Powell said on Saturday: “I met many of the families who have lost teenagers from online activity, and I promised them we would act. We owe it to all those who have been harmed online to get this right.
“Regulation of the online world is urgently needed but, instead of strengthening the online safety bill, the government has weakened, gutted and delayed it.
“The weakened bill will give abusers a licence to troll, and the business models of big tech will give these trolls a platform.”
A spokesperson for the Molly Rose Foundation, a charity set up by friends and family after Molly Russell’s death, said: “We welcome this development, as Molly’s case demonstrates clearly how legal but harmful content amplified by income-generating algorithms can have fatal consequences for vulnerable people.
“It is unsatisfactory that this bill has still not been passed into law. It has taken far too long to get to this point with legislation already being watered down, and this risks failing our children and young adults.”
Announcing the changes to the bill to the Commons on 5 December, the digital and culture secretary, Michelle Donelan, said that the safety of children was at its heart. But she acknowledged that concerns in her own party about threats to free speech had persuaded her to drop the provisions on “legal but harmful” material.
“They would have meant that the government were creating a quasi-legal category – a grey area – and would have raised the very real risk that to avoid sanctions, platforms would carry out sweeping take-downs of content, including legitimate posts, eroding free speech in the process,” she told MPs.
Donelan added that the bill now progressing through parliament nonetheless offered enhanced protection.