Apple CEO Tim Cook has taken a strong and principled stand to protect the privacy of Apple's customers and smartphone users everywhere. He is refusing to comply with the order of a California judge that Apple help the FBI break the encryption on the iPhone of one of the San Bernardino terrorists by building a back door. Apple is not opposing the order because it sympathizes with the terrorists – far from it; this was a terrible tragedy. Thus far, Apple has done everything it can to co-operate with law enforcement.
This order, however, goes one step too far. It compels Apple to develop special software to disable a security feature offered to iPhone owners: one that effectively wipes the phone's contents, including the stored decryption key, after 10 failed passcode attempts. Disabling it would allow the FBI to use "brute force" – trying passcode after passcode – to break the encryption offered by iOS, on which millions of consumers rely to protect the privacy and security of their information.
Make no mistake: Building a back door into the San Bernardino iPhone to access encrypted data would build a back door into all iPhones of a similar model. The operating system software is essentially the same; thus, opening one iPhone to access personal data would open all others. And once the back door has been built, it would be available not only to the U.S. government, but foreign governments and criminal hackers around the world. Why? Because such a back door would be in great demand and the sources for it would grow geometrically.
It would reside within Apple itself, with the potential for leaks; even more disturbing, it would reside within the U.S. government, with all the ensuing problems of keeping it restricted. And once it fell into the hands of the criminal element, the security and privacy that we all hold dear in our iPhones would be gone. Should the U.S. government be allowed to force Apple, a private company, to jeopardize the security and privacy of its users? I say no.
But if the government were successful, what would be the result?
First, criminal hackers and terrorists would simply stop using these products. A recent survey conducted by crypto expert Bruce Schneier found that of the 619 entities selling encryption products, 412 (two-thirds) were based outside the United States, offering 567 competing products. The bad guys would simply switch to those alternatives and abandon iPhones altogether.
Second, personal data – such as medical records, financial and banking data, location data, and personal contacts – could be exposed to governments and criminals without the consent of the individuals involved. Say goodbye to freedom and privacy.
Third, over time, people would stop using products that governments could hack into. Such a market shift would compromise the prosperity of U.S. companies and dampen the remarkable innovation they have demonstrated.
Encryption is a vital tool to protect our fundamental rights in a digital world. Millions of people benefit, in numerous ways, from the privacy and security offered by the iPhone and similar products. The U.S. government would be jeopardizing those benefits for the sake of hacking into one iPhone, which would no doubt lead to additional cases of hacking. Interestingly, governments love to use the "greater good" argument as justification for their actions. I submit that the greater good in this case would be served by not building a back door into the iPhone. This is yet another example of government overreach.
If Apple is forced to do this, it will take its toll on innovation and prosperity, which will further erode our privacy and freedom. That is simply too high a price to pay.
Ann Cavoukian is executive director of the Privacy and Big Data Institute – Where Big Data meets Big Privacy at Ryerson University in Toronto, and was information and privacy commissioner of Ontario from 1997 to 2014.