Online safety: the encryption dilemma – trade-offs

From e-commerce and video-conferencing to messaging friends and colleagues, we take the encryption, and thus the security, of our digital communications for granted. However, while it ensures our privacy, it is also exploited by criminals to evade detection, for example by those plotting terrorist atrocities or exchanging child sexual exploitation and abuse (CSEA) material. The dilemma – whether to maintain privacy or tackle crime facilitated online – underlies the opposing and often stridently expressed views about encryption. As messaging platforms roll out end-to-end encryption (E2EE), under which not even service providers can decipher messages sent over their systems, law enforcement agencies have sought to preserve their covert ability to observe our communications. The UK’s latest proposals, in amendments to the government’s flagship Online Safety Bill, have aroused fierce opposition from industry and privacy groups. The ongoing difficulty in resolving the privacy versus safety conundrum arises in part from a failure to level with the public about the trade-offs involved.

Invented in the 1970s, computer-based encryption has become ubiquitous, its roll-out taking place in the shadow of the 2013 Snowden revelations of covert bulk interception of electronic communications by the US and UK, and more recently supercharged by fear of ransomware attacks and other computer misuse. In response to the ensuing sense of data insecurity, major messaging platforms such as WhatsApp, iMessage and Telegram implemented E2EE by default. It is also an opt-in feature of Facebook Messenger, but Meta’s plans to make it standard by 2023 have aroused intense political hostility, with lurid descriptions of the company as an “enabler of abuse” and a government-funded media blitz targeting Meta’s largest investors, leveraging their reputational sensitivity to pressurise the tech giant into cancelling its E2EE plan. The media campaign attracted derision from privacy groups and disapproval from the UK’s data watchdog, the Information Commissioner’s Office (ICO). The ICO refused to accept a binary choice between E2EE and online safety, highlighting the technology’s security benefits for both children and businesses and asserting that “it is vital that one form of online safety is not traded off for another.”

Battle lines drawn

Amidst this troubled landscape, and just before parliamentary business was effectively paused pending the installation of a new Prime Minister in September, the government proposed amendments to its Online Safety Bill which many commentators and interest groups saw as jeopardising E2EE. The amendments would give the online safety regulator, OFCOM, power to issue Notices requiring platform providers to “use their best endeavours” to develop or source, and deploy, technology to identify and prevent users encountering terrorist and CSEA content on their platforms. OFCOM would judge whether a company had complied with a Notice, and the obligation to do so would be reinforced by the threat of enforcement action, including financial penalties of up to 10% of an offending company’s global turnover.

Shortly after the government proposed the amendments, the increasingly complex Online Safety Bill, once trumpeted by the government as “a system of regulation the world will want to emulate”, was placed on hold until autumn 2022. Despite the Bill being packed off on a summer break, the proposed amendments sparked an immediate hostile reaction. Perceiving a threat to WhatsApp’s E2EE and the company’s highly prized reputation for privacy, its CEO threatened to disable the chat service in the UK completely rather than buckle to politicians’ demands. Likewise, Meta effectively issued a riposte by announcing that it would shortly begin testing E2EE as the default option for its Messenger service.

Despite the recent outcry, the government’s proposals in fact have progenitors outside the UK. In 2018, Australia passed the Telecommunications (Assistance and Access) Act, enabling the authorities to compel providers to create backdoors into E2EE platforms for the investigation of offences carrying penalties of three or more years’ imprisonment. Earlier this year, the European Commission proposed an EU-wide regulation controversially mandating the detection, reporting and removal of CSEA material by hosting services and interpersonal communication service providers. In 2021, Apple itself floated, but then postponed, a plan to scan images in iCloud and on users’ devices for CSEA material and report them to the US authorities.

Moreover, the proposals in the Online Safety Bill resemble existing technical capability notice (TCN) provisions under the UK’s Investigatory Powers Act 2016 and its accompanying Technical Capability Regulations. Those require that, so far as reasonably practicable, telecommunications operators maintain the capacity to disclose the content of communications if required. Parliamentary debates on the TCN provisions never wrung from the government an unequivocal denial that they could be used to undermine E2EE. However, before a TCN is imposed, it must be adjudged ‘necessary’ and ‘proportionate’, and must be independently approved by a Judicial Commissioner. By contrast, the Online Safety Bill proposals lack such safeguards, and the proposed ‘best endeavours’ requirement imposes a qualitatively heavier compliance burden on service providers than meeting the ‘reasonably practicable’ threshold under the investigatory powers legislation.

Client-side scanning – a silver bullet?

The government denied that the powers proposed in the Online Safety Bill could force platform providers to compromise E2EE on their services, but it is widely believed that the amendments pave the way for law enforcement to roll out a surveillance technique known as client-side scanning (CSS). CSS involves downloading software onto individual devices such as smartphones, tablets and computers to conduct algorithmic scanning of text, images, videos and files for prohibited content before it is sent from the device. Where prohibited material is discovered, the CSS software may prevent it from being sent and may alert third parties such as the police.
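
To make the mechanism concrete, the sketch below illustrates one commonly described form of CSS: outgoing content is checked on the device against a list of hashes of known prohibited material before it reaches the encryption layer. It is a deliberately simplified, hypothetical example – the function names, the placeholder hash list and the reporting hook are assumptions for illustration, not any vendor’s actual implementation – and real systems typically rely on perceptual hashing (such as PhotoDNA) so that near-duplicate images still match.

```python
import hashlib

# Hypothetical sketch only: on-device matching of outgoing content against a
# list of known prohibited hashes, run before the message is handed to the
# encryption and transmission layer. Placeholder values throughout.

# Assumed to be supplied and updated by an authority or NGO (placeholder entry).
KNOWN_PROHIBITED_HASHES = {
    "0000000000000000000000000000000000000000000000000000000000000000",
}


def report_match(digest):
    """Placeholder reporting hook; a real client might notify the platform or police."""
    print(f"match reported for digest {digest}")


def scan_before_send(payload):
    """Return True if the payload may be handed to the E2EE layer, False if blocked."""
    digest = hashlib.sha256(payload).hexdigest()
    if digest in KNOWN_PROHIBITED_HASHES:
        report_match(digest)
        return False  # blocked on the device, before encryption
    return True


if __name__ == "__main__":
    attachment = b"example attachment bytes"
    if scan_before_send(attachment):
        print("content passed local scan; hand off to the E2EE layer for sending")
    else:
        print("content blocked by client-side scan")
```

Because the check happens on the device itself, the message remains end-to-end encrypted in transit – which is why proponents argue CSS leaves E2EE intact, and why opponents argue the surveillance has simply moved to the endpoint.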

Proponents of CSS hail it as a minimally intrusive technological solution which protects the public, avoids traditional security concerns about secret backdoors into encrypted communications via ‘ghost protocols’ or ‘escrow keys’, leaves E2EE intact, and reconciles the competing demands of law enforcement and privacy campaigners. Opponents counter that it constitutes an insidious form of bulk surveillance, effectively placing bugs in everyone’s pockets and altering the way we interact with our electronic devices and each other; that it is prone to error and to manipulation by sophisticated criminals or hostile states; and that it is liable to ‘scope creep’ – once accepted in principle, the temptation to scan for other offences and socially objectionable behaviour will become irresistible.

Platform providers point to alternative, less intrusive electronic methods they already deploy to identify suspicious activity and facilitate more targeted law enforcement investigation. Such methods include using non-content metadata to identify suspicious messaging behaviour, preventing adults from contacting children they do not already know, and encouraging the reporting of harmful messages. Unfortunately, none of these methods is problem-free. For example, metadata analysis can only suggest potential illegal activity and would likely be insufficient for law enforcement to obtain a search warrant for further investigation. Likewise, ‘age-gating’ services is liable to subversion – people lie about their age. While reporting negative behaviours online can be a useful warning tool, it places the onus on victims to flag abuse when they may be too embarrassed or afraid to do so, or may not even realise what is happening. More sophisticated technological solutions, for example ‘homomorphic encryption’, which allows limited analysis of still-encrypted data and so could potentially assuage privacy concerns, remain at a developmental stage and are currently too slow to operate at scale.
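
By way of illustration only, the toy code below sketches the additive homomorphism that makes such schemes attractive: two values are encrypted, the ciphertexts are combined without ever being decrypted, and only the key holder can recover the sum. It is a simplified Paillier-style construction with tiny, insecure parameters chosen purely for readability; it is not the technology any platform has proposed and should not be used in practice.

```python
import math
import random

# Toy illustration of additively homomorphic encryption (a simplified Paillier
# scheme). Ciphertexts can be combined without decryption; only the key holder
# can read the result. Insecure parameters, for illustration only.


def keygen(p=293, q=433):  # toy primes; real keys use primes of 1024+ bits
    n = p * q
    n_sq = n * n
    lam = (p - 1) * (q - 1) // math.gcd(p - 1, q - 1)  # lcm(p - 1, q - 1)
    g = n + 1
    mu = pow((pow(g, lam, n_sq) - 1) // n, -1, n)      # modular inverse of lambda
    return (n, g), (lam, mu, n)


def encrypt(pub, m):
    n, g = pub
    n_sq = n * n
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:                         # r must be coprime to n
        r = random.randrange(1, n)
    return (pow(g, m, n_sq) * pow(r, n, n_sq)) % n_sq


def decrypt(priv, c):
    lam, mu, n = priv
    n_sq = n * n
    return ((pow(c, lam, n_sq) - 1) // n) * mu % n


if __name__ == "__main__":
    pub, priv = keygen()
    c1, c2 = encrypt(pub, 17), encrypt(pub, 25)
    c_sum = (c1 * c2) % (pub[0] ** 2)   # addition performed on ciphertexts alone
    print(decrypt(priv, c_sum))         # prints 42, recovered only by the key holder
```

The appeal for the encryption debate is that, in principle, limited analysis could be run on data a service provider cannot read; the drawback, as noted above, is that such schemes are currently far too slow to operate at the scale of a mainstream messaging platform.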

Speaking plainly

The privacy and safety difficulties thrown up by technology provoke impassioned arguments from politicians and campaigners alike which all too easily drown out ongoing expert efforts to find a technological resolution to this fraught issue. But achieving a mutually acceptable solution requires trust, underpinned by transparency and an informed public debate about the privacy trade-offs involved in maximising safety. Pressing ahead with opaque and controversial legislative measures concerning E2EE without widespread support risks undermining public faith in the integrity of the digital means by which we now routinely communicate, as well as eroding the UK’s model of policing by consent. Were that to happen, the Online Safety Bill would risk being held up not as a regulatory model to be emulated across the world, as the government hopes, but as an example of ill-thought-through legislative hubris and a recipe for increased mistrust and dispute.

Julian Hayes advises companies and individuals in the rapidly developing field of data protection, especially in the context of data breaches and law enforcement investigations, where necessary litigating to ensure that the actions of state authorities are properly constrained. A partner at BCL for four years, he has vast experience of all types of criminal inquiry, including the unlawful obtaining of data and computer misuse offences. He is a well-known and highly regarded commentator on cybersecurity and privacy issues. He advises telecommunications operators on their obligations under UK investigatory powers legislation and provides practical guidance on how to handle the demands placed upon them, including establishing systems that ensure legal compliance and protection for the operator. He has advised in relation to the US-UK Bilateral Data Sharing Agreement and the forthcoming UK online harms legislation.
