Law enforcement and other government agencies regularly ask companies for help in retrieving private customer data – when someone is suspected of a crime, investigators tap the suspect's phone and access their bank records. So, you might well ask, why was there such an uproar over the Apple case – in which the Federal Bureau of Investigation requested that the company break into the iPhone of one of the San Bernardino shooters?
It all comes down to privacy, a crucial new battleground between business and government. In this digital age, privacy is as necessary a condition for business to function as free markets and infrastructure. For one thing, there's the need to keep data safe from hackers, from the government and from competitors. For another, being able to assure consumers that the data they often unwittingly hand over will be kept private is crucial to gaining their trust.
Businesses have always collected and stored our secrets – health-care records, legal documents, financial investments, and so on. But the scale and scope of the data collected today – by everyone from small retail shops to cable companies to financial institutions – has increased exponentially. These days, some insurance companies collect fitness data directly from our Fitbits. Automakers track our driving habits via in-car computers. Banks use social media to help determine credit scores.
And this stuff isn't stored behind steel doors, flanked by guys with guns. It is stored on servers and in the cloud. And no matter how sophisticated the digital locks protecting it, this data is without doubt more vulnerable.
The digital locks used to protect your data are called encryption, which was at the heart of the Apple-FBI case. The feds had asked Apple to build custom software for that one phone in order to break its encryption. Apple argued the request was overly onerous and invasive; in the end, the FBI let the company off the hook after figuring out how to crack the phone on its own.
That's because, rather than being a one-off act – like delving into a single safe-deposit box or tapping a single phone line – breaking encryption radically increases the risk that all digital locks will be more easily crackable. Any enterprise that employs encryption has become a custodian of privacy, with the onus of defending customers from incursions by the government itself.
Safeguarding encryption, then, becomes paramount. The digital locks rely on so-called keys – very simply, long strings of numbers that can be used either to decrypt coded information or to act as guarantors, like a digital signature. The reason that breaking encryption is so risky is that it increases the likelihood that those keys will fall into the wrong hands. If and when that happens, it is not just one database that gets cracked – it's any database that shares the same digital lock. In the case of a smartphone, that could affect millions of people.
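The shared-lock problem above can be sketched in a few lines of Python. This is a deliberately toy cipher (a simple XOR, not anything a real system would use – production systems rely on vetted algorithms such as AES), and every name and value here is invented for illustration. The point it demonstrates is structural: whoever holds the one key can open every record protected by it.

```python
from itertools import cycle

def xor_crypt(data: bytes, key: bytes) -> bytes:
    # Toy symmetric cipher: XOR-ing with the same key twice
    # returns the original bytes. Illustration only -- real
    # encryption uses vetted ciphers like AES, not raw XOR.
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

key = b"0123456789abcdef"  # one shared "digital lock" (hypothetical)

# Two separate "databases" protected by the same key:
db_one = xor_crypt(b"alice: medical records", key)
db_two = xor_crypt(b"bob: banking history", key)

# Anyone who obtains the key cracks both at once:
print(xor_crypt(db_one, key))  # b'alice: medical records'
print(xor_crypt(db_two, key))  # b'bob: banking history'
```

A tool built to defeat the lock on one device is, in effect, a copy of that key: it does not expire after one use, which is why the risk compounds across every system sharing the lock.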
That's only one reason Apple pushed back so hard against the FBI's request. The other is that, in today's reality, the accumulation of so much data by private companies presents them with a new opportunity for marketing. Don't be concerned about entrusting us with your private information, Apple was saying. We are so committed to protecting it that we are willing to fight the federal government on your behalf.
Canadian law – specifically, the Personal Information Protection and Electronic Documents Act, or PIPEDA – regulates which information private companies are allowed to collect, and Canada tends to be comparatively aggressive in policing privacy. (One example: the Privacy Commissioner's targeting of Facebook for, among other things, oversharing users' private information with third-party app developers.) But there's no doubt privacy will increasingly be a tug-of-war between government and enterprise in this country. Consider the RCMP's recently expressed desire for warrantless access to online subscriber information, or Bill C-13, passed last year under the Conservative government: Ostensibly aimed at eradicating cyberbullying, it could be used to force companies to hand over information encrypted in the cloud.
All of this makes one thing clear: Companies need to have a clear policy around privacy – what data they'll keep, how they'll keep it safe and, perhaps most importantly, how they'll execute their public duty around encryption when it comes to the state.
Beyond that, a company's future can depend on how safe it is perceived to be by a public deeply and increasingly worried about privacy. Which means that, in the modern digital era, secrets themselves might just be the secret to success.
Follow Navneet Alang on Twitter @navalang