Are Security Backdoors a Necessary Evil?
As reports of governments conducting mass online surveillance in the name of security have mounted, more consumers have looked to stronger, on-by-default encryption.
Security agencies across the globe, alarmed by this trend, have been quietly lobbying governments to effectively ban unbreakable encryption.
A very public discussion about the merits and ethics of government-mandated backdoors in cryptography surfaced again early this year with the high-profile standoff between Apple and the U.S. Federal Bureau of Investigation (FBI).
by Eric Pinkerton
The Apple/FBI argument involved the contents of an iPhone used by one of the two deceased San Bernardino attackers. The FBI’s case seemed simple enough: if the phone were thought to contain details related to the attack, including information about the couple’s motivation or communications with unknown accomplices, wasn’t it reasonable for Apple to make that information available to investigators?
The FBI asked Apple, via a court order, to provide a signed software file that could disable the phone’s password-protection features. The FBI specified that the code could include a unique identifier, so the feature would load and execute only on the subject device, allaying fears that the code would be misused by the FBI or find its way into the hands of criminals.
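To make that request concrete, here is a minimal sketch of how signed, device-bound code might work in principle. Everything in it is hypothetical: the key handling and field names are invented, and an HMAC stands in for a real public-key code signature; none of it reflects Apple’s actual signing scheme.

```python
import hmac, hashlib, json

# Hypothetical sketch of device-bound signed code. An HMAC stands in for a
# real public-key code signature; all names and keys are invented for
# illustration and bear no relation to Apple's actual signing scheme.

VENDOR_KEY = b"vendor-signing-key"  # in reality, held only by the vendor

def sign_update(payload: bytes, device_uid: str) -> dict:
    """Vendor side: bind the update to one device UID, then sign the whole blob."""
    blob = json.dumps({"uid": device_uid, "payload": payload.hex()}).encode()
    tag = hmac.new(VENDOR_KEY, blob, hashlib.sha256).hexdigest()
    return {"blob": blob, "tag": tag}

def load_update(update: dict, my_uid: str) -> bytes | None:
    """Device side: verify the signature first, then the UID binding."""
    expected = hmac.new(VENDOR_KEY, update["blob"], hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, update["tag"]):
        return None  # signature invalid: refuse to load
    meta = json.loads(update["blob"])
    if meta["uid"] != my_uid:
        return None  # validly signed, but for a different device
    return bytes.fromhex(meta["payload"])

update = sign_update(b"unlock-tool", device_uid="SUBJECT-DEVICE-UID")
assert load_update(update, my_uid="SUBJECT-DEVICE-UID") is not None
assert load_update(update, my_uid="SOME-OTHER-DEVICE") is None
```

The catch, as critics saw it, is that the binding check is only as strong as the signing key: whoever holds it can re-sign the same tool for any other device’s identifier.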
Apple refused to comply with the court order. The company argued that in the wrong hands, the software could be used over and over again to unlock any iPhone. “In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks,” Apple wrote.
Apple and others believe that it’s simply not possible to architect a secure system with a backdoor. They maintain that a system is either secure or it is not.
A backdoor is a means of access that bypasses security mechanisms. One well-known example in the physical world is the Transportation Security Administration (TSA)-approved Travel Sentry lock that hit the market in the aftermath of 9/11.
The lock promised to secure the contents of luggage while still allowing inspection by a trusted party (the TSA) if the need arose. In this case, the backdoor came in the shape of two locking mechanisms: one operated by the owner’s key, and one by a set of master keys carried by TSA agents.
The Travel Sentry locks did not fare well. They spawned many reports of luggage reaching its destination somewhat lighter, thanks to unscrupulous TSA agents, or with cases badly damaged by agents who used force instead of keys to look inside. To add insult to injury, The Washington Post published a photo of the master keys in 2014, giving security researchers enough detail to re-create them with 3D printers.
The example illustrates how flawed the concept of backdoors can be, and how inevitably key material falls into the wrong hands.
A 2015 report (Keys Under Doormats: Mandating Insecurity by Requiring Government Access to All Data and Communications) by a group of preeminent security experts pointed to three issues with the backdoor approach:
- It forces a U-turn from current security best practices.
- It increases system complexity, known to be the enemy of security.
- It creates concentrated targets for bad actors to attack.
“The costs would be substantial, the damage to innovation severe, and the consequences to economic growth difficult to predict,” the report concluded.
So what’s the answer? For privacy advocates, the holy grail in security is what is increasingly referred to as “zero knowledge.” This is a system designed with strong encryption and clever key management so that even the creator has no knowledge of or visibility into the data it contains. A provider who has taken this approach effectively has an irrefutable claim: It’s simply not possible to pass on customer information to requesting agencies.
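A minimal sketch of the idea, assuming client-side encryption with a key derived from the user’s passphrase (the function names are invented for illustration, not any vendor’s actual API or design):

```python
import os, hashlib
from cryptography.hazmat.primitives.ciphers.aead import AESGCM  # pip install cryptography

# Illustrative "zero knowledge" storage: the key is derived on the client
# from the user's passphrase, so the provider only ever sees ciphertext.
# Function names are invented; this is not any vendor's actual design.

def derive_key(passphrase: str, salt: bytes) -> bytes:
    # scrypt is deliberately slow and memory-hard, blunting offline guessing
    return hashlib.scrypt(passphrase.encode(), salt=salt,
                          n=2**14, r=8, p=1, dklen=32)

def client_encrypt(passphrase: str, plaintext: bytes) -> bytes:
    salt, nonce = os.urandom(16), os.urandom(12)
    key = derive_key(passphrase, salt)
    ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)
    return salt + nonce + ciphertext  # this blob is all the provider stores

def client_decrypt(passphrase: str, blob: bytes) -> bytes:
    salt, nonce, ciphertext = blob[:16], blob[16:28], blob[28:]
    return AESGCM(derive_key(passphrase, salt)).decrypt(nonce, ciphertext, None)

blob = client_encrypt("correct horse battery staple", b"customer records")
assert client_decrypt("correct horse battery staple", blob) == b"customer records"
```

Because neither the passphrase nor the derived key ever leaves the client, the provider can store and serve the blob without ever being able to read it; a subpoena yields only ciphertext. The trade-off is equally absolute: if the user forgets the passphrase, the provider cannot recover the data either.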
Some think the FBI selected the iPhone case to force Apple to take a public stand on the issue of backdoors before the company had the chance to perfect its zero-knowledge solution. That argument appears to be moot for now, as a third party has helped the FBI access the iPhone in dispute.
Although this particular case will, seemingly, be put to rest, the question of providing backdoors in security systems is likely to continue inspiring intense public debate.
Another Backdoor You May Remember
A decade before the TSA lock debacle came the Clipper chip, a short-lived key escrow system designed mainly with telephone communications in mind and based on secret cryptography developed by the National Security Agency. The purpose of the Clipper chip was to allow data to be encrypted between A and B, while still allowing inspection by a trusted party should the need arise.
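In outline, escrow schemes of this kind work roughly as in the following sketch. It is a deliberate simplification: the real chip used the classified Skipjack cipher and a hardware Law Enforcement Access Field (LEAF), for which AES-GCM here is only a modern stand-in.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM  # pip install cryptography

# Toy key-escrow sketch: each message's session key travels twice, once
# implicitly (the recipient obtains it via normal key exchange, omitted
# here) and once sealed under an escrow key held by the trusted party.

ESCROW_KEY = AESGCM.generate_key(bit_length=256)  # held by the escrow agent

def send(plaintext: bytes) -> dict:
    session_key = AESGCM.generate_key(bit_length=256)
    n1, n2 = os.urandom(12), os.urandom(12)
    return {
        "ciphertext": (n1, AESGCM(session_key).encrypt(n1, plaintext, None)),
        # the escrow field: the session key, readable only with the escrow key
        "leaf": (n2, AESGCM(ESCROW_KEY).encrypt(n2, session_key, None)),
    }

def escrow_decrypt(message: dict) -> bytes:
    # With a warrant, the agent recovers the session key from the escrow
    # field and then decrypts the traffic itself.
    n2, sealed = message["leaf"]
    session_key = AESGCM(ESCROW_KEY).decrypt(n2, sealed, None)
    n1, ciphertext = message["ciphertext"]
    return AESGCM(session_key).decrypt(n1, ciphertext, None)

assert escrow_decrypt(send(b"meet at noon")) == b"meet at noon"
```

The structural weakness is visible even at this level: every conversation’s confidentiality now rests on a single escrow key, exactly the kind of concentrated target the Keys Under Doormats report warns about.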
The Clipper chip had some teething problems. It was technically flawed, with numerous security vulnerabilities. American companies ended up paying more to manufacture phones that no one wanted to use. And because the U.S. government had no jurisdiction outside the United States, it could insist that U.S. manufacturers include the chips in their products, but it could not compel overseas companies to do the same.
Announced in 1993, it was dead in the water by 1996.
Eric Pinkerton is a principal security consultant at CSC.