The Federal Bureau of Investigation has possession of the Apple iPhone of one of the shooters in San Bernardino. The problem is that the FBI cannot read it.
This week a federal court ordered Apple to grant the FBI access to the shooter's phone by hacking its own product.
The hack is necessary because much of the information stored on iPhones is encrypted. Encryption encodes information in such a way that it is unreadable to anyone who lacks a secret key. Sometimes companies store backups of this key. But the iPhone key is generated in part from the passcode created by the user, and even Apple does not know that passcode. What is more, Apple has designed the phone with safeguards explicitly against the kind of brute-force passcode guessing that the FBI wants to conduct.
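The idea of deriving an encryption key from a passcode can be sketched with a standard key-stretching function. This is a simplified illustration, not Apple's actual scheme: on a real iPhone the passcode is also entangled with a device-unique hardware key, so the derivation cannot be run anywhere but on the device itself.

```python
import hashlib
import os

def derive_key(passcode: str, salt: bytes, iterations: int = 100_000) -> bytes:
    """Stretch a short passcode into a 256-bit encryption key.

    Illustrative only: Apple's real design mixes in a hardware key
    that never leaves the phone, which is why even Apple cannot
    decrypt a user's data without the passcode.
    """
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, iterations)

salt = os.urandom(16)
key = derive_key("1234", salt)

assert derive_key("1234", salt) == key   # the right passcode reproduces the key
assert derive_key("1235", salt) != key   # any other guess yields garbage
```

Because the key depends on the passcode, there is no master copy to hand over; recovering the data means recovering the passcode.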
According to Trail of Bits CEO Dan Guido, there is a time delay between guesses of the iPhone passcode, and too many wrong entries can cause the user's data to be wiped completely. The FBI has targeted these security features specifically, demanding that Apple “bypass or disable the auto-erase function” and “ensure that when the FBI submits passcodes to the subject device, software running on the device will not purposefully introduce any additional delay.” The basic idea is to allow the FBI to guess passcodes ad infinitum and as rapidly as it wants.
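The two protections the FBI wants removed can be modeled in a few lines. The class and the delay schedule below are a toy sketch of the mechanism Guido describes, not Apple's actual firmware; the exact delay values are illustrative.

```python
MAX_ATTEMPTS = 10
# Escalating wait (in seconds) after repeated wrong guesses.
# These values are illustrative, not Apple's published schedule.
DELAY_AFTER = {5: 60, 6: 300, 7: 900, 8: 3600, 9: 3600}

class PasscodeGuard:
    """Toy model of the auto-erase and guess-delay features."""

    def __init__(self, passcode: str):
        self._passcode = passcode
        self._failures = 0
        self.wiped = False

    def try_passcode(self, guess: str) -> tuple[bool, int]:
        """Return (unlocked, seconds to wait before the next guess)."""
        if self.wiped:
            raise RuntimeError("auto-erase triggered: data is unrecoverable")
        if guess == self._passcode:
            self._failures = 0
            return True, 0
        self._failures += 1
        if self._failures >= MAX_ATTEMPTS:
            self.wiped = True                      # the auto-erase function
            return False, 0
        return False, DELAY_AFTER.get(self._failures, 0)  # the guess delay
```

With these features in place, brute-forcing is hopeless: a few wrong guesses force hour-long waits, and the tenth destroys the data. Disabling both, as the court order demands, turns a four-digit passcode into at most 10,000 rapid guesses.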
Apple CEO Tim Cook has said that the request is akin to asking Apple to hack its own customers. Nate Cardozo, staff attorney for the privacy watchdog Electronic Frontier Foundation, argues that such a stance invalidates Apple's commitments to its customers: “Apple is being compelled to create computer code that is false in a very meaningful way,” he says.
The FBI, meanwhile, wants access to the phone in order to investigate the shooter's possible ties to radical Islamic terrorist groups, as well as to conduct a broader investigation of the shooters' communications.
Beyond San Bernardino, some think that the FBI has broader aims. There has been significant discussion of late over whether Congress should mandate a “back door” into encrypted data. A back door would be a standardized means for authorized parties to overcome security measures and gain access to protected data. The San Bernardino shootings highlight important questions. Who should hold the keys? How should parties be authorized to use them, if at all?
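One hypothetical back-door design is key escrow: the key that encrypts the data is itself stored twice, once wrapped for the user and once for an authorized party. The sketch below is a toy using one-time-pad XOR to keep the idea visible; any real proposal would use public-key cryptography, and none of the names here come from an actual scheme.

```python
import os

def xor(a: bytes, b: bytes) -> bytes:
    """XOR two equal-length byte strings (toy one-time-pad primitive)."""
    return bytes(x ^ y for x, y in zip(a, b))

# Encrypt the data under a random data key.
data = b"a private message"
data_key = os.urandom(len(data))
ciphertext = xor(data, data_key)

# Wrap the data key twice: once for the user, once for escrow.
user_key = os.urandom(len(data_key))
escrow_key = os.urandom(len(data_key))          # held by a company or government
wrapped_for_user = xor(data_key, user_key)
wrapped_for_escrow = xor(data_key, escrow_key)  # this copy is the "back door"

# Either key holder can recover the plaintext independently:
assert xor(ciphertext, xor(wrapped_for_user, user_key)) == data
assert xor(ciphertext, xor(wrapped_for_escrow, escrow_key)) == data
```

The design question the article raises is visible in the last two lines: whoever holds `escrow_key` can read everything, which is why the custody and authorization of that key are the crux of the debate.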
Security versus privacy
Thus emerges a common theme: a trade-off between security and privacy. The FBI, it could be argued, is attempting to provide security to US citizens by investigating the shootings, and Apple is trying to protect its customers by refusing to set a precedent for government seizure of their data.
This debate has been simmering for some time. In December 2013, for example, the US government issued a search warrant that demanded that Microsoft give access to the accounts of one of its users allegedly involved in the drug trade and money laundering. Microsoft refused. Because the data was stored in Ireland, the company contended, the US could not demand access.
“Imagine this scenario [Microsoft argued]. Officers of the local Stadtpolizei investigating a suspected leak to the press descend on Deutsche Bank headquarters in Frankfurt, Germany. They serve a warrant to seize a bundle of private letters that a New York Times reporter is storing in a safe deposit box at a Deutsche Bank USA branch in Manhattan. The bank complies by ordering the New York branch manager to open the reporter’s box with a master key, rummage through it, and fax the private letters to the Stadtpolizei.”
It is easy to imagine Americans recoiling.
Despite the apparent conflict, though, security and privacy are not always at odds. Locking your front door, for instance, effectively addresses both security and privacy. On the technical side, so do spam and phishing filters, which try to prevent identity theft. If we can get beyond the security-versus-privacy paradigm, we can dream up new solutions that improve one without hurting the other.
Those solutions need not be technical. In the physical domain, a search warrant addresses the security-privacy challenge from a legal perspective. And some solutions are what we might call conventional; privacy expert Helen Nissenbaum often refers to an envelope as a measure that achieves security and privacy in most cases with ease and simplicity. Social convention and popular morality support the idea of leaving sealed envelopes sealed, with little technological expense.
Apple and the shooter’s iPhone
Fine. But what about the solution to the Apple case? The context is a criminal investigation and a search authorized by a warrant. A first impression suggests that Apple is obliged to hand over access to the phone.
But Apple does not already have that access. The FBI's request invokes the All Writs Act of 1789, which requires companies to aid in the execution of a court order such as a search warrant, with the condition that the aid not be “unreasonably burdensome.” Because the FBI is asking Apple to write new code to circumvent existing security protections, the request could be understood as burdensome.
Perhaps legislation on encryption standards is actually a more systematic solution. While asking companies to open back doors has its own problems, at least bringing the matter to Congress would address an issue that is important not only for the San Bernardino shootings, but also in security and privacy at large.
Jeffrey Pawlick is a PhD Candidate in Electrical Engineering at the Tandon School of Engineering, New York University.