Apple CEO Tim Cook rejected a court order to help the FBI break into a work-issued iPhone used by a gunman in the mass shooting in San Bernardino, Calif., sparking a heated debate between security and privacy advocates. Shane Dingman parses the legal and technical arguments and explains why some view this battle as the most important tech case in a decade.
What is happening?
At issue is whether U.S. law enforcement can compel Apple Inc. to unlock an iPhone owned and password-protected by one of its customers.
Some have painted what the FBI is after as a back door to encryption, but the technical details of the case are only nominally about the encryption software that keeps anyone from downloading and reading the data on your iPhone. While the security-versus-privacy argument is similar, what Magistrate Judge Sheri Pym actually ordered is for Apple to enable the FBI to make unlimited guesses at the pass code on the iPhone 5c that was in the possession of San Bernardino gunman Syed Farook.
There are only 10,000 options for a four-digit pass code, but Apple has a feature that would wipe the contents of an iPhone after 10 wrong guesses. That is what the FBI wants Apple to disable. Apple argues that any system it developed to weaken the security of its devices would undermine the privacy and data protection of all its clients.
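The arithmetic behind the FBI's request is straightforward. As a rough sketch, assuming each guess takes about 80 milliseconds of hardware-enforced key derivation (a figure Apple has published for its devices, used here as an assumption; the function name and the delay parameter are illustrative, not from the court order):

```python
# Back-of-the-envelope worst-case time to brute-force a numeric iPhone
# passcode, assuming ~80 ms of hardware key derivation per guess and
# no wipe-after-10-failures feature and no software-imposed delays --
# exactly the conditions the order asks Apple to create.

def brute_force_worst_case(digits, seconds_per_guess=0.08):
    """Return (number of combinations, worst-case seconds to try them all)."""
    combinations = 10 ** digits          # e.g. 10,000 for a 4-digit code
    return combinations, combinations * seconds_per_guess

combos, secs = brute_force_worst_case(4)
print(f"4-digit: {combos:,} combinations, ~{secs / 60:.0f} minutes worst case")

combos, secs = brute_force_worst_case(6)
print(f"6-digit: {combos:,} combinations, ~{secs / 3600:.1f} hours worst case")
```

Under those assumptions, every four-digit code could be tried in well under half an hour, which is why the auto-erase feature and the escalating delays, not the encryption itself, are the real obstacles the FBI wants removed.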
What does the FBI want from Apple?
Here’s what the actual order says:
“Apple’s reasonable technical assistance shall accomplish the following three important functions: (1) it will bypass or disable the auto-erase function whether or not it has been enabled; (2) it will enable the FBI to submit passcodes to the SUBJECT DEVICE for testing electronically via the physical device port, Bluetooth, Wi-Fi, or other protocol available on the SUBJECT DEVICE; and (3) it will ensure that when the FBI submits passcodes to the SUBJECT DEVICE, software running on the device will not purposefully introduce any additional delay between passcode attempts beyond what is incurred by Apple hardware.”
The order gives Apple five days to respond.
“The U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone,” Apple chief executive Tim Cook wrote in a statement on the company’s website, in as full-throated a defence of the privacy of its users as he has made in public.
“The government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers – including tens of millions of American citizens – from sophisticated hackers and cybercriminals,” he wrote.
When reached, Apple spokespersons declined to comment beyond Mr. Cook’s statement.
Is this legal?
As far back as December, 2014, journalists took note of the U.S. Department of Justice making a novel argument for demanding hacking assistance from U.S. tech companies under the 1789 All Writs Act. Now codified as 28 USC 1651, it says: “The Supreme Court and all courts established by Act of Congress may issue all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law.”
In October, 2015, Judge James Orenstein of the Eastern District of New York ruled in another case involving Apple that “the authorities on which the government relies do not support the conclusion that the All Writs Act permits the relief the government seeks. That does not necessarily mean, however, that such relief is unavailable under the statute.” Indeed, Apple is still waiting on a separate ruling from the same judge in a very similar case, and more cases with identical All Writs Act arguments are before other courts, some involving unidentified tech companies.
If the device is locked, how could Apple break in?
Security researchers spent much of Wednesday arguing on Twitter and in forums such as Hacker News over whether and how Apple could hack its own encrypted devices. Systems for sale right now on the Internet claim to be able to crack pass codes using only access to the iPhone’s charging port. Some security experts have suggested Apple would need to introduce the modified code by breaking open the case and accessing the iPhone’s computer chip hardware – using the physical test access port and debugging tools, for instance.
If the government gets its way in the present case, it will be more difficult for Apple to argue in the future that the burden of similar orders is undue, says Chris Parsons, postdoctoral fellow with the University of Toronto’s Citizen Lab. Dr. Parsons points out that on the iPhone 5c and older devices, the security is software-based, but on the 5s and newer devices, which include Apple’s Secure Enclave, encryption is hardware-backed and vastly more difficult to hack. “The government could tell the court, ‘It was difficult last time, and they did it.’ The more dangerous part of the wedge is if Apple’s [current] claims of this being onerous are overridden.”
“Essentially, the government is asking Apple to create a master key so that it can open a single phone,” wrote Kurt Opsahl, deputy executive director and general counsel of the Electronic Frontier Foundation. “And once that master key is created, we’re certain that our government will ask for it again and again, for other phones.”
Why is Apple fighting back?
“This is the most important tech case in a decade,” tweeted infamous former security contractor Edward Snowden, who exposed reams of classified data-collection technologies developed by the U.S. National Security Agency and other government bodies. “The FBI is creating a world where citizens rely on Apple to defend their rights, rather than the other way around.”
But Apple has also expressed concern for its bottom line in previous cases: “Forcing Apple to extract data in this case, absent clear legal authority to do so, could threaten the trust between Apple and its customers and substantially tarnish the Apple brand,” company lawyers argued in October.
“The government’s request also risks setting a dangerous precedent,” wrote Alex Abdo, staff lawyer with the American Civil Liberties Union’s Speech, Privacy, and Technology Project.
“If the FBI can force Apple to hack into its customers’ devices, then so, too, can every repressive regime in the rest of the world.”
Tech swats away authorities’ pressure
Last summer, Waterloo, Ont.-based BlackBerry Ltd. balked when Pakistan asked for access to encrypted data from the secure BlackBerry Enterprise Service so authorities could monitor e-mail and message traffic in that country. The company said it would close its operations in Pakistan rather than give the government that kind of access. Eventually, Pakistan’s telecom authority backed down, and at the end of the year, BlackBerry said it would stay. BlackBerry has faced similar pressure in India and Saudi Arabia. But chief executive John Chen said recently the company will hand over some private customer data to law enforcement authorities in some circumstances.
In 2014, Rogers Communications Inc. and Telus Corp. challenged orders obtained by Peel Regional Police in Ontario to hand over personal information from about 40,000 mobile-phone users. The “tower dump” would have included all call records from several cellphone towers around the time of a string of jewellery store robberies – data that would have been useful in the criminal investigation. The police revoked the orders after the companies filed their challenge in court, but the case went to a hearing, and last month an Ontario judge ruled that the orders breached the Charter of Rights and Freedoms.
In December, Facebook refused requests from Brazilian authorities to intercept communications data and personal records from its WhatsApp messaging platform that would have been useful in a drug-trafficking investigation. That refusal prompted a Brazilian judge to order a 48-hour suspension of WhatsApp, one of the most popular mobile apps in the country. Outrage ensued among the 100 million Brazilians who use the service for text messages and phone calls, and an appeals court overturned the ruling 12 hours later.