In April 2019, the UK published an Online Harms White Paper proposing a broad new statutory duty of care for social media companies and platform providers to tackle widespread concerns about a host of online issues, from terrorist and child sexual abuse content to cyber bullying and trolling. More than 18 months on, BCL’s Greta Barkle asks where the proposals have got to.
When first introduced, the Government’s innovative plans received widespread attention. The proposed statutory duty of care would require those affected (said to be fewer than 5% of UK businesses) to take reasonable steps to keep users safe, and prevent others coming to harm as a direct consequence of activity on their services.
Following subsequent public consultation, however, legislative momentum appeared to wane. The Government’s only visible progress towards achieving its aim of becoming the “safest place in the world to be online” was an initial consultation response released in February 2020. Criticised as tepid, the response was brief: a summary of the clear themes amongst the 2,400 consultation responses and bland assurances that close attention would be paid to each. Unsurprisingly, at the top of that list of themes was how the White Paper’s proposals would impact online freedom of expression.
A year and a half after the White Paper’s release, the Minister of State for Digital and Culture in the Department for Digital, Culture, Media and Sport has indicated the Government’s full consultation response will be released “within weeks” with a Bill following early next year.
Allaying concerns that enforcement measures in the White Paper were set to be watered down, the Minister firmly denied this would be the case, indeed promising quite the opposite, with the protection of children being “at the very heart of our approach to tackling online harms.”
The Minister assured listeners that the new online harms regulator would, as planned, have the power to require internet service providers to block access to social media sites guilty of serious breaches of their statutory duty of care. She also confirmed that senior company managers would be held personally liable for breaches and would be personally subject to sanctions including fines. The Minister stopped short, however, of promising that criminal responsibility would be imposed on senior company managers; if absent from the Bill, the omission is likely to be heavily criticised. The NSPCC’s Head of Child Safety Online Policy warned that “if this Bill is to deliver ground-breaking protections for children we need comprehensive sanctions that include GDPR equivalent fines and criminal liability under UK law.”
Breathing down the Government’s neck in the race to claim the moniker “safest place in the world to be online”, the European Union is due to unveil its Digital Services Act package (DSA) on 2 December. The DSA aims to modernise the current EU legal framework for digital services, balancing the safety of users online with the freedom for innovative digital businesses to grow. The DSA’s path to enactment has not been without difficulty, though; it has already run into headwinds in the form of tech heavyweights such as Facebook, Google and Apple. Calling for a new Online Responsibility Framework, EDiMA, a European trade association, released a position paper recommending that the DSA avoid regulating online content deemed “harmful” but not “illegal” in the first instance.
In a year blighted by Coronavirus and Brexit negotiations, the UK Government can perhaps be forgiven for the long gestation period of its Online Harms legislation. However, with social-distancing measures driving ever-increasing numbers of people, including vulnerable children and adolescents, to the internet, the risk of online harm in all its forms is proliferating. The clamour for the Government to bring forward its legislation swiftly in an effort to stem the problem will only grow.