Internet sites are set to face significant fines – or even be blocked entirely – for failing to take down harmful content under new government plans.
The Department for Digital, Culture, Media and Sport (DCMS) and the Home Office have made a number of joint proposals for content-sharing platforms, such as Twitter, Facebook and Instagram, published in an ‘Online Harms White Paper’ on Monday.
Suggestions include setting up an independent regulator that will write a strict “code of practice” for tech companies in the UK.
Senior managers at companies could be held responsible for breaches of the code.
“The era of self-regulation for online companies is over,” Culture Secretary Jeremy Wright said of the White Paper on BBC Breakfast.
“Voluntary actions from industry to tackle online harms have not been applied consistently or gone far enough.
“If you look at the fines available to the Information Commissioner around the GDPR rules, that could be up to 4 per cent of a company’s turnover… we think we should be looking at something comparable here.”
A levy on the tech industry to fund the regulator has also been proposed.
A public consultation on the plans will now run for 12 weeks.
What Sort Of Content Do The New Proposals Intend To Ban?
Government plans suggest monitoring content-sharing platforms to make sure they are taking down and blocking posts containing material already defined as criminal under UK law.
This includes content that incites hate crime or terrorism, depicts child sexual abuse, constitutes so-called ‘revenge porn’ or harassment, encourages self-harm, or facilitates the sale of illegal items, such as drugs or weapons.
It also covers the spread of fake news, trolling and cyber-bullying, which are less clearly defined under UK law.
What Will The ‘Code Of Practice’ Be?
The exact details of the new code are yet to be decided – and will be defined in full by the regulatory body.
However, suggestions include preventing the spread of fake news by forcing social networks to employ fact-checkers and prioritise the promotion of legitimate news sources.
Social media companies may be required to produce annual reports on harmful content found on individual platforms too.
What Are The Critics Saying About The New Proposals?
This stricter approach raises a number of issues, not least around the censorship of some materials – and whose job it will be to decide which material is considered harmful, and which isn’t.
Misinformation in particular has been listed as potentially harmful, but how will the regulator instruct companies on what needs to be removed under a duty of care and what doesn’t?
And who will the regulator be? Will they be partisan, or completely independent from government and politics? How will unconscious bias be tackled by those regulators, and what sort of sanctions will they be able to impose on those companies that breach the code of practice?
Former culture secretary John Whittingdale MP expressed his concerns about the new measures in an article for The Mail on Sunday.
“With their ‘duty of care’, well-meaning ministers want the same laws to apply online as offline – but they risk dragging British citizens into a draconian censorship regime instead,” he wrote.
“However, countries such as China, Russia and North Korea, which allow no political dissent and deny their people freedom of speech, are also keen to impose censorship online, just as they already do on traditional media.
“This mooted new UK regulator must not give the despots an excuse to claim that they are simply following an example set by Britain, where civil liberties were first entrenched in Magna Carta 800 years ago.”
What Have Social Media Networks Said About The White Paper?
Facebook’s head of UK policy Rebecca Stimson stressed the need for a uniform approach, while also maintaining free speech.
“New regulations are needed so that we have a standardised approach across platforms and private companies aren’t making so many important decisions alone,” she said.
“New rules for the internet should protect society from harm while also supporting innovation, the digital economy and freedom of speech.”
Meanwhile Katy Minshall, Twitter’s head of UK public policy, said: “We look forward to engaging in the next steps of the process, and working to strike an appropriate balance between keeping users safe and preserving the open, free nature of the internet.”
Jim Killock, the executive director of Open Rights Group, a UK-based organisation that works to preserve digital rights and freedoms, fears that the proposals would “create state regulation of the speech of millions of British citizens”.
Tech UK, an umbrella body that represents the UK’s technology industry, added that the government must be clear about “how trade-offs are balanced between harm prevention and fundamental rights”.