Ofcom will be given the power to fine social media companies millions of pounds if they fail to combat misinformation under the UK Government’s updated plans for new online safety laws – but news content will be protected.
Every social media platform accessible in the UK, and any other website or app that hosts user-generated content, will have to remove and limit the spread of illegal content such as child sexual abuse material, terrorist material and content encouraging suicide.
The Department of Digital, Culture, Media and Sport (DCMS) said the biggest social media sites would also be expected to set and enforce clear terms and conditions stating how they will handle content that is legal but could cause physical or mental harm to adults.
This includes dangerous disinformation and misinformation about coronavirus vaccines. The Government said the measure would “help bridge the gap between what companies say they do and what happens in practice”.
The DCMS made clear that safeguards for freedom of expression will be built into the legislation, with news publishers’ online journalism and reader comments exempt.
There will also be specific measures to protect journalistic content when it is shared on social media platforms.
It is a win for groups such as the Society of Editors and the News Media Association, which campaigned for a total exemption for journalistic content to be written into the legislation.
Society of Editors executive director Ian Murray said he was pleased the Government had recognised there is no need to include news media in the legislation with its “adherence to high editorial standards and existing strong regulatory bodies”.
But he added: “…however the devil will be in the detail. The digital platforms when faced with huge fines for non-adherence to the new regulations may resort to the use of sweeping algorithms to remove content deemed as harmful. Such measures must have in-built protection for legitimate media content.”
Culture Secretary Oliver Dowden said: “I’m unashamedly pro-tech but that can’t mean a tech free-for-all. Today Britain is setting the global standard for safety online with the most comprehensive approach yet to online regulation.
“We are entering a new age of accountability for tech to protect children and vulnerable users, to restore trust in this industry, and to enshrine in law safeguards for free speech.
“This proportionate new framework will ensure we don’t put unnecessary burdens on small businesses but give large digital businesses robust rules of the road to follow so we can seize the brilliance of modern technology to improve our lives.”
The Government confirmed that broadcast watchdog Ofcom will have its remit extended to cover online regulation, with its running costs paid by in-scope companies earning above an as-yet-unconfirmed level of global annual revenue.
It will be able to fine companies “failing in their duty of care” up to £18m or 10% of their annual global turnover, whichever is higher, and block non-compliant services from being accessed in the UK.
Google owner Alphabet reported revenue of $162bn in 2019, while Facebook made $71bn.
Senior managers could also face criminal sanctions if their company fails to take the rules seriously, for example by failing to respond fully, accurately or promptly to enquiries from Ofcom.
Dame Melanie Dawes, who joined Ofcom as chief executive in March, said: “We’re really pleased to take on this new role, which will build on our experience as a media regulator.
“Being online brings huge benefits, but four in five people have concerns about it. That shows the need for sensible, balanced rules that protect users from serious harm, but also recognise the great things about online, including free expression.
“We’re gearing up for the task by acquiring new technology and data skills, and we’ll work with Parliament as it finalises the plans.”
The measures are now expected to be introduced in an Online Safety Bill next year.