The Online Safety Bill has passed its final parliamentary hurdle in the House of Lords, meaning it will finally become law after years of delay.
The flagship legislation will force social media companies to remove illegal content and protect users, especially children, from material that is legal but harmful.
The idea was conceived in a white paper in 2019, but turning it into law has been a long and rocky road – with delays and controversies over issues such as freedom of expression and privacy.
Perhaps most controversially, one of the proposals would force platforms such as WhatsApp and Signal to undermine message encryption so that private chats could be checked for criminal content.
Technology Secretary Michelle Donelan said: “The Online Safety Bill is a game-changing piece of legislation. Today this Government is taking a huge step forward in our mission to make Britain the safest place in the world to be online.”
Social media bosses face jail time
The bill will require social media companies to remove illegal content quickly or prevent it from appearing in the first place, including content that promotes self-harm.
Other illegal content it wants to crack down on includes the sale of drugs and weapons, inciting or planning terrorism, sexual exploitation, hate speech, fraud and revenge porn.
Communications regulator Ofcom will be largely responsible for enforcing the bill, with social media bosses facing billions of pounds in fines or even imprisonment if they fail to comply.
The bill also creates new criminal offences, including cyber-flashing and sharing “deepfake” pornography.
The legislation has received wide support from charities such as the NSPCC, the safety group Internet Watch Foundation (IWF), bereaved parents who say harmful online content contributed to their child’s death, and survivors of sexual abuse.
However, there have been concerns in the Tory Party that it is simply too far-reaching, potentially threatening freedom of expression online.
‘Momentous day for children’
Tech companies criticised proposed rules to regulate legal but harmful content, arguing they would make firms unfairly liable for material on their platforms.
Ms Donelan removed this measure from the bill in an amendment last year, which said that instead of removing legal but harmful content, platforms would have to give adults tools to hide certain material they do not want to see.
This includes content that does not meet the criminal threshold but may be harmful, such as glorification of eating disorders, misogyny and some other forms of abuse.
But after a backlash from parents, she stressed that the bill still tasks companies with protecting children not only from illegal content but from any material that may “cause serious trauma”, such as cyberbullying, by enforcing age limits and age-verification measures.
NSPCC Chief Executive Sir Peter Wanless said: “We are absolutely delighted to see the Online Safety Bill passed through Parliament. It is a momentous day for children and will finally result in the ground-breaking protection they can expect online.”