End-to-end (E2E) encryption, the highest publicly available standard for securing digital communications, is under threat from governments concerned about its implications for public safety and security. Privacy advocates are sounding the alarm, and the gulf between the two sides seems virtually unbridgeable.
While the masking of sensitive communications has happened through various means over millennia, E2E encryption is unique. It uses “asymmetric” cryptography, in which the parties to a communication channel each have both a “public key” and a “private key” associated with their communication device. The public key allows others to encrypt messages so that only the intended recipient, using their own private key, can recover the plaintext.
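To make the principle concrete, the minimal sketch below is a hypothetical illustration using Python’s third-party cryptography package, which the article does not reference. It shows a message encrypted with a recipient’s public key and recovered only with the matching private key; production E2E messengers build on the same idea with key agreement and symmetric ratchets rather than encrypting messages directly with RSA.

```python
# Minimal sketch of asymmetric encryption: the public key encrypts,
# only the matching private key decrypts. Not a full messaging protocol.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# The recipient generates a key pair and shares only the public half.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

oaep = padding.OAEP(
    mgf=padding.MGF1(algorithm=hashes.SHA256()),
    algorithm=hashes.SHA256(),
    label=None,
)

# Anyone holding the public key can produce ciphertext...
ciphertext = public_key.encrypt(b"meet at noon", oaep)

# ...but only the holder of the private key can recover the plaintext.
assert private_key.decrypt(ciphertext, oaep) == b"meet at noon"
```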
The confidentiality and security of iMessage, FaceTime, WhatsApp, Telegram, Viber, and Signal all rely on the strength of this technology. At least in principle, no third party, service provider, or application developer should be able to decrypt communications transmitted through these channels. Nor should intelligence services, law enforcement, or any kind of criminal entity be able to intrude or interfere.
Of course, some points of vulnerability do still exist in E2E-encrypted services: users’ privacy could be undermined by malware with access to keyboard, camera, or microphone inputs, and off-device backups (e.g., Apple’s iCloud) could be breached or accessed under court order. Communication records stored locally would also be accessible to anybody able to obtain and unlock a device.
However, by and large, messages and calls masked in this fashion are still safe from prying eyes, at least in transit. If a device is secured properly and no off-device backup is accessible, it can be challenging for even the U.S. Federal Bureau of Investigation (FBI) to gain access.
In fact, this precise set of preconditions led U.S. federal investigators to initiate a court case against Apple in early 2016 in an effort to compel the company to create a “backdoor” to its phones. This was intended to allow the FBI to break into the iPhone of a deceased terrorist and to access such devices on a larger scale in the future. Apple argued that doing so would render its products fundamentally defective in terms of protecting users’ data, and, ultimately, the legal contest ended in something like a ceasefire. The FBI paid $900,000 to third-party hackers for a passcode bypass, and Apple doubled down, at least publicly, on its commitment to user privacy.
The FBI’s argument, and the broader case made by governments for an encryption workaround, centers on the notion that some degree of access to E2E-encrypted communications is necessary for intelligence and law enforcement agencies to maintain their investigative and surveillance capabilities, and hence to preserve public safety and national security.
The idea that intelligence services’ surveillance capabilities are substantially compromised by encryption is almost certainly fallacious. The law enforcement side of the argument is more reasonable, however: coupled with the potential for permanent deletion of messages, E2E encryption creates an opportunity for criminal enterprises to coordinate and plan activities with impunity, the so-called “going dark” problem.
From the orchestration of civil disobedience and other nonviolent protests to the preservation of highly personal data from cybercriminals or employers, there are many legitimate reasons why citizens might wish to be able to communicate with discretion. Any technical workaround to E2E encryption would therefore need to be difficult to abuse, both by bad actors and by legitimate agents acting outside of their constitutional bounds.
Unfortunately, weakening encryption standards to enable law enforcement access to communications would inevitably make all covered devices and services less secure. As Andrew Grotto, former White House Senior Director for Cybersecurity Policy, puts it: “Whatever we want to call it (a front door, a back door, a sliding-glass door), this kind of vulnerability would be a potential entry point for good guys and bad guys alike.”
The core of the problem, according to Michael Daniel, President and CEO of the Cyber Threat Alliance and former White House Cybersecurity Coordinator, is that “The encryption debate is a security-versus-security debate. Strong encryption protects our digital ecosystem from threats and abuse but can also inhibit law enforcement’s efforts to prevent or disrupt crimes. It is therefore up to societies to decide how much security of the first kind they are willing to trade for the second, and vice versa.”
Efforts in the U.S. Congress to mandate the creation of such a mechanism have so far been unsuccessful, suffering from a lack of clarity about what tangible obligations would be placed on technology providers. Similarly, legislation hastily enacted by the Australian parliament in late 2018, which placed broad burdens of cooperation on technology companies, including assistance in working around E2E encryption, has proved problematic.
Attempting to resolve all of this problem’s inherent tensions at once is simply impracticable. However, a working group at the Carnegie Endowment for International Peace recently identified one example of what a more realistic first step towards compromise might look like. Its proposal for a locally stored, device-specific “master key” would allow law enforcement agents to access data at rest on devices in their physical possession, providing a resolution in the kind of atypical “edge” case referenced above without systemically undermining the functionality of E2E-encrypted services.
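The working group’s report does not prescribe a particular implementation, but the general shape of such a scheme can be sketched in code. In the hypothetical illustration below (my own assumptions, not the Carnegie design), data at rest is encrypted with a random symmetric key, and that key is “wrapped” under an escrow public key whose private half is held off the device, so recovery requires both physical possession of the device and the escrow holder’s cooperation.

```python
# Hypothetical key-wrapping sketch: device data is encrypted with a symmetric
# key; only an escrow-wrapped copy of that key is stored on the device.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

oaep = padding.OAEP(
    mgf=padding.MGF1(algorithm=hashes.SHA256()),
    algorithm=hashes.SHA256(),
    label=None,
)

# Escrow key pair: the private half stays with the custodian, not the device.
escrow_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
escrow_public = escrow_private.public_key()

# On the device: encrypt data at rest with a random key, keep only the
# ciphertext and the wrapped copy of that key.
data_key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
ciphertext = AESGCM(data_key).encrypt(nonce, b"locally stored records", None)
wrapped_key = escrow_public.encrypt(data_key, oaep)
del data_key  # the plaintext key is not retained

# Lawful access with the device in hand: the custodian unwraps the key,
# which decrypts the data at rest on that one device only.
recovered_key = escrow_private.decrypt(wrapped_key, oaep)
assert AESGCM(recovered_key).decrypt(nonce, ciphertext, None) == b"locally stored records"
```

Because the wrapped key is useless without both the physical device and the escrow private key, access in this sketch is inherently device-by-device rather than a network-wide capability, which is the property that distinguishes the proposal from a blanket backdoor.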
Even so, two objections to such a highly targeted solution bear consideration. First, that governments in China, Russia, Saudi Arabia, and elsewhere in the illiberal world might paint it as providing normative justification for their own existing and future systems of comprehensive state surveillance. Second, that such a workaround could still be abused by insufficiently constrained law enforcement elements even in democratic states.
Beyond the need for technical ingenuity, implementing such a least-worst compromise would therefore require a complementary reinvigoration of checks on government power. Adoption should come through a multilateral framework, while associated constraints on individual governments’ use of this functionality would need to be strong and consistent.
As policy questions at the intersection of technology, ethics, and state power go, this is small fry compared to what we will face in the coming decades with the advent of true A.I., quantum computing, smart cities and more. Demanding more zealous oversight of government surveillance capabilities at this stage is thus essential, as those capabilities will only continue to grow more powerful and expansive.