One of the San Bernardino shooters, Syed Rizwan Farook, had an iPhone (it was actually a work phone owned by his employer). Now the FBI has it, but they can't unlock it. They've looked at other information: they can get call records from Verizon, and they asked for, and Apple gave them, iCloud backups of the device, but those stopped in October. Now the FBI wants Apple to provide software to help them unlock the phone. Apple said no, so the FBI got United States Magistrate Judge Sheri Pym to issue a writ compelling Apple to help them, and Apple has again said no.
Techdirt, in addition to a description of the order, has the actual order itself: No, A Judge Did Not Just Order Apple To Break Encryption On San Bernardino Shooter's iPhone, But To Create A New Backdoor. Wired has it too (pdf).
Here's Apple's reply, A Message to Our Customers. I think it's very well worded. Here's the heart of it:
Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.
The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.
Here's a technical analysis by Dan Guido arguing that Apple can comply with the FBI court order, mostly because the iPhone in question is a 5C, which doesn't have TouchID or the Secure Enclave that later models do. The order asks Apple to supply software that disables the auto-erase after ten failed passcode attempts, removes the escalating delays between attempts, and lets passcodes be submitted electronically, so that the FBI can brute-force the passcode. More info: Errata Security: Some notes on Apple decryption San Bernadino phone and tl;dr Apple’s technical capabilities under FBI AWA order.
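To get a sense of why removing those protections matters, here's a back-of-the-envelope sketch in Python. The ~80 ms per attempt figure is an assumption taken from Apple's published description of how the passcode key derivation is calibrated on-device; it's not a measurement of this particular phone.

```python
# Rough worst-case brute-force times once the retry limit and the
# escalating delays are gone. Each guess still has to run through the
# on-device key derivation, assumed here to take ~80 ms per attempt.

SECONDS_PER_ATTEMPT = 0.08  # assumption: ~80 ms per on-device try

def worst_case(keyspace: int) -> str:
    """Format the time needed to exhaust the whole keyspace."""
    seconds = keyspace * SECONDS_PER_ATTEMPT
    if seconds < 3600:
        return f"{seconds / 60:.1f} minutes"
    if seconds < 86400:
        return f"{seconds / 3600:.1f} hours"
    return f"{seconds / 86400:.0f} days"

for label, keyspace in [
    ("4-digit PIN", 10**4),
    ("6-digit PIN", 10**6),
    ("6-char lowercase+digits passphrase", 36**6),
]:
    print(f"{label:36s} {worst_case(keyspace)}")
```

A 4-digit PIN falls in minutes and a 6-digit one in about a day, while even a short alphanumeric passphrase holds out for years. That's why the software protections the FBI wants removed, not the 80 ms key derivation by itself, are the real defense on a 5C.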
There's also a debate about the government's use of the All Writs Act to compel Apple. I've seen some argument on that point, but Apple says this is "unprecedented".
The implications of the government’s demands are chilling. If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.
Now the writ itself says on page 11:
Further, based on the authority given to the courts under the All Writs Act, courts have issued orders, similar to the one the government is seeking here, that require a manufacturer to assist in accessing a cell phone's files so that a warrant may be executed as originally contemplated. See, e.g., In re Order Requiring [XXX], Inc. to Assist in the Execution of a Search Warrant Issued by This Court by Unlocking a Cellphone, 2014 WL 5510865 at *2 (S.D.N.Y. Oct. 31, 2014)
I wonder what company XXX is? Apparently even judges being asked to issue similar writs don't know (see the footnote at the bottom of page 8 in this pdf)! Ars Technica says the All Writs Act has come up in other cases, but cites several experts saying "The DoJ went with the nuclear option" and that they can't think of similar precedents.
Joshua Gans writes about Game Theory and Apple’s Encryption Challenge. It's interesting that Apple can do this but the FBI (or NSA) can't, since the phone will only run software updates signed with Apple's key. Also interesting is that Apple can't do this on more recent iPhones, and this is clearly another salvo in the debate over whether companies like Apple should be able to make devices with secure encryption technology.
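To make that concrete, here's a minimal sketch of the signed-firmware idea, using Python's cryptography package with Ed25519 purely for illustration; Apple's actual signing scheme and update formats are different and more involved.

```python
# Illustration of why only the vendor can produce installable firmware:
# the device ships with the vendor's public key baked in and refuses
# any image whose signature doesn't verify. Ed25519 is used here only
# for illustration; Apple's real scheme is RSA-based with per-device
# personalization.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Vendor side: the signing key never leaves the vendor.
vendor_key = Ed25519PrivateKey.generate()
device_trusted_pubkey = vendor_key.public_key()  # baked into the device

def device_will_install(image: bytes, signature: bytes) -> bool:
    """The device only installs images that verify against the baked-in key."""
    try:
        device_trusted_pubkey.verify(signature, image)
        return True
    except InvalidSignature:
        return False

official_image = b"iOS update from the vendor"
print(device_will_install(official_image, vendor_key.sign(official_image)))  # True

# Anyone else can write firmware, but can't sign it with the vendor's key.
attacker_key = Ed25519PrivateKey.generate()
rogue_image = b"firmware with the retry limits removed"
print(device_will_install(rogue_image, attacker_key.sign(rogue_image)))  # False
```

An attacker can write whatever firmware it likes, but without the vendor's private key it can't produce a signature the device will accept, which is exactly the position the FBI is in and why the order targets Apple.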
Ben Thompson provides a really good summary and makes some good points in Apple Versus the FBI, Understanding iPhone Encryption, The Risks for Apple and Encryption.
This solution [a government backdoor using a master key in future products] is, frankly, unacceptable, and it’s not simply an issue of privacy: it’s one of security. A master key, contrary to conventional wisdom, is not guessable, but it can be stolen; worse, if it is stolen, no one would ever know. It would be a silent failure allowing whoever captured it to break into any device secured by the algorithm in question without those relying on it knowing anything was amiss. I can’t stress enough what a problem this is: World War II, especially in the Pacific, turned on this sort of silent cryptographic failure. And, given the sheer number of law enforcement officials that would want their hands on this key, it landing in the wrong hands would be a matter of when, not if.
This is why I’m just a tiny bit worried about Tim Cook drawing such a stark line in the sand with this case: the PR optics could not possibly be worse for Apple. It’s a case of domestic terrorism with a clear cut bad guy and a warrant that no one could object to, and Apple is capable of fulfilling the request. Would it perhaps be better to cooperate in this case secure in the knowledge that the loophole the FBI is exploiting (the software-based security measures) has already been closed, and then save the rhetorical gun powder for the inevitable request to insert the sort of narrow backdoor into the disk encryption itself I just described?
Then again, I can see the other side: a backdoor is a backdoor, and it is absolutely the case that the FBI is demanding Apple deliberately weaken security. Perhaps there is a slippery slope argument here, and I can respect the idea that government intrusion on security must be fought at every step. I just hope that this San Bernardino case doesn’t become a rallying cry for (helping to) break into not only an iPhone 5C but, in the long run, all iPhones.
Rich Mogull in Why the FBI's request to Apple will affect civil rights for a generation says:
The crux of the issue is should companies be required to build security circumvention technologies to expose their own customers? Not “assist law enforcement with existing tools,” but “build new tools.”
The FBI Director has been clear that the government wants back doors into our devices, even though the former head of the NSA disagrees and supports strong consumer encryption. One reason Apple is likely fighting this case so publicly is that it is a small legal step from requiring new circumvention technology, to building such access into devices. The FBI wants the precedent far more than they need the evidence, and this particular case is incredibly high profile and emotional.
He also previously explained Why Apple Defends Encryption: "Apple is nearly unique among technology firms in that it’s high profile, has revenue lines that don’t rely on compromising privacy, and sells products that are squarely in the crosshairs of the encryption debate. Because of this, everything Apple says about encryption comes from a highly defensible position, especially now that the company is dropping its iAd App Network."
As he points out, "Google is fundamentally an advertising company that collects data on its users." For Google to use that info, it has to have access to it, so it's available to the government with a warrant. Microsoft, until recently, has had a history of working with the government. Most of Facebook's and Twitter's information about you is already public.
The ACLU is, not surprisingly, on Apple's side here: "This is an unprecedented, unwise, and unlawful move by the government. The Constitution does not permit the government to force companies to hack into their customers' devices. Apple is free to offer a phone that stores information securely, and it must remain so if consumers are to retain any control over their private data."
Even a Congressman agrees with Apple, and it's probably no coincidence that it's one with a computer science degree from Stanford: Rep. Ted Lieu (D-CA), Congressman Lieu Statement on Apple Court Order.
This court order also begs the question: Where does this kind of coercion stop? Can the government force Facebook to create software that provides analytic data on who is likely to be a criminal? Can the government force Google to provide the names of all people who searched for the term ISIL? Can the government force Amazon to write software that identifies who might be suspicious based on the books they ordered?
Forcing Apple to weaken its encryption system in this one case means the government can force Apple—or any other private sector company—to weaken encryption systems in all future cases. This precedent-setting action will both weaken the privacy of Americans and hurt American businesses. And how can the FBI ensure the software that it is forcing Apple to create won’t fall into the wrong hands? Given the number of cyberbreaches in the federal government—including at the Department of Justice—the FBI cannot guarantee this back door software will not end up in the hands of hackers or other criminals.
Researching this, I learned this is not the first time that a government agency has used the All Writs Act to coerce Apple into breaking into an iPhone. The EFF wrote about a case last October, Judge to DOJ: Not All Writs. The documents for that case are here, and apparently the judge still has not ruled on the matter.
In one of those documents the government says:
Apple has an established track record of assisting law enforcement agents by extracting data from passcode-locked iPhones pursuant to court orders issued under the All Writs Act. The government has confirmed that Apple has done so in numerous federal criminal cases around the nation, and the vast majority of these cases have been resolved without any need for Apple to testify. In the course of handling these requests, Apple has, on multiple occasions, informed the government that it can extract data from a passcode-locked device and provided the government with the specific language it seeks in the form of a court order to do so.
Interesting past history, but Apple is clearly trying to move away from this, and it's for the privacy of their customers. I'm on Apple's side here. Not the least of my reasons is that our own government objects when other governments try to do this. This is from March 2015: Obama sharply criticizes China's plans for new technology rules.
In an interview with Reuters, Obama said he was concerned about Beijing's plans for a far-reaching counterterrorism law that would require technology firms to hand over encryption keys, the passcodes that help protect data, and install security "backdoors" in their systems to give Chinese authorities surveillance access.
"This is something that I’ve raised directly with President Xi," Obama said. "We have made it very clear to them that this is something they are going to have to change if they are to do business with the United States."
Drew Harwell gives a nice overview of Tim Cook's recent advocacy: Tim Cook just escalated Apple’s fight with the FBI – and his own role as corporate activist.
On a good note, The President's NSA Advisory Board Finally Gets a Tech Expert. Steve Bellovin is a great choice.