If you told someone prior to 2013 that they should really think about encrypting the messages they send and receive online – wrapping casual instant messages, serious business e-mails, even sexts in a layer of digital protection from eavesdroppers – you would probably have been dismissed as overreacting.
Now, here we are.
More than a year after former defence contractor and whistleblower Edward Snowden revealed myriad instances of mass government surveillance – not to mention the disclosures here in Canada reported by The Globe and Mail – it’s become clear that nearly anything you transmit over the Internet is fair game for interception by someone, somewhere.
This new reality has spawned – or perhaps more accurately, accelerated – the development of apps and services promising secure, encrypted and private communication. The question is, how do we get these tools into the mainstream? And more importantly, how can less savvy users know what to trust?
Perhaps one answer is to make encryption invisible, so that most users won’t even know there’s strong encryption happening at all.
Signal, an iPhone app released at the end of July, is a simple alternative to Skype, FaceTime or Google Voice for making voice calls over the Internet. Its clean design and painless integration with your contact list make it as easy to use as anything that comes pre-installed on the iPhone.
But unlike those other apps, Signal displays two seemingly random words alongside the call’s duration and other status information. So, to your friend, or lover or secretive confidant on the other end of the line, you start your conversation by asking “Does your screen say ‘angry pirate?’ ” And if the words match up – just like that! – you’ve confirmed that there’s no one in the middle of your connection eavesdropping on your call.
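The trick resembles what cryptographers call a short authentication string: both phones derive the same words from the call’s shared key material, so an eavesdropper who negotiates a different key with each side produces mismatched words. Here is a minimal sketch of the idea in Python – the tiny word list and the derivation are illustrative only, not Signal’s actual ZRTP implementation:

```python
import hashlib

# Hypothetical word list; real implementations use full 256-entry
# word lists so each byte of the digest maps to a distinct word.
WORDS = ["angry", "pirate", "quiet", "harbor"]

def short_auth_string(shared_secret: bytes) -> str:
    """Derive two human-checkable words from the call's shared secret.

    Both endpoints hash the same key material, so they display the
    same words -- unless an attacker sits in the middle holding a
    different key on each leg of the call.
    """
    digest = hashlib.sha256(shared_secret).digest()
    # Each digest byte indexes into the word list.
    first = WORDS[digest[0] % len(WORDS)]
    second = WORDS[digest[1] % len(WORDS)]
    return f"{first} {second}"

# Both ends of the call derive the words from the same secret:
print(short_auth_string(b"example-shared-secret"))
```

Because both words come from a hash of the negotiated key, a man in the middle would have to find two different keys that hash to the same pair of words on both screens – unlikely to succeed when the words are compared out loud.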
Cryptography is “the reason why the app exists to begin with,” says Christine Corbett Moran, co-lead of the iOS team at Signal developer Open WhisperSystems, during a conversation over – what else? – Signal. “But we want it to feel like an afterthought to the user, not something that gets in their face. They just want to use a really good, free messaging app with all their friends – and it just so happens that it’s the most secure app that money, or free, can buy.”
You don’t always have to confirm the words with the person on the other end of the line, says Ms. Corbett Moran. Every call with the app is encrypted and secure by default, but the option is there, should the time come when you want to ensure the privacy of a call.
It’s a good strategy – a nice mix of security and simplicity, all at once. “The user interface has to basically only allow secure communications,” says Matthew Green, a cryptographer and research professor at Johns Hopkins University. “The minute you allow a choice between secure and insecure, people are either going to screw it up and they’re going to send the secure things insecure, or they’re just going to send everything insecure.”
Perhaps the most successful implementation of this philosophy, in Mr. Green’s view, is actually Apple’s iMessage. Every message is encrypted end-to-end between the sender and receiver – a process that’s never apparent to the end user. The caveat, says Mr. Green, is that “you’re trusting Apple to manage your [encryption] keys for you, which is the trade-off that you get for having something that’s really easy to use and ubiquitous.”
Some have criticized Apple for this, claiming that its control over encryption keys gives the company a backdoor, or a way to decrypt users’ messages (Apple denies this). But whatever the case, Nadim Kobeissi agrees that this approach to usability is sound.
“I think the best way to do this would be to implement encryption everywhere, but not say anything about it,” says Mr. Kobeissi, a Montreal-based developer who is a competitor of sorts to Open WhisperSystems, and has taken similar approaches in his work.
His secure chat application Cryptocat, for example, dispenses with one of the big headaches of typical encrypted communication, which is the exchanging of encryption keys. You can still see all of the underlying cryptographic information if you choose, of course, but it’s not necessarily front and centre to the experience. Again – it’s just a good chat app that happens to be built on good encryption.
At the HOPE hacker conference in New York last month, Mr. Kobeissi released another app called miniLock, a secure filesharing service that he views as a modern successor to PGP (PGP, which stands for Pretty Good Privacy and dates back to the early 1990s, remains one of the gold standards for encryption today). MiniLock dispenses with the complicated process of creating, sharing and verifying keys, which are often many, many lines long. You just need someone’s miniLock ID, which is small enough to fit into a tweet.
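That compactness comes from encoding the raw bytes of a modern public key directly. As a rough sketch of the idea – not miniLock’s exact format, which also folds in a checksum – a 32-byte Curve25519 public key, base58-encoded, comes out to roughly 44 characters:

```python
import os

# Base58 alphabet (as used by Bitcoin addresses): no 0, O, I or l,
# so the ID survives being read aloud or retyped.
ALPHABET = "123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz"

def base58_encode(data: bytes) -> str:
    """Encode bytes as a base58 string, preserving leading zero bytes."""
    n = int.from_bytes(data, "big")
    out = ""
    while n > 0:
        n, r = divmod(n, 58)
        out = ALPHABET[r] + out
    # Each leading zero byte becomes a leading '1'.
    return "1" * (len(data) - len(data.lstrip(b"\0"))) + out

public_key = os.urandom(32)   # stand-in for a real Curve25519 public key
key_id = base58_encode(public_key)
print(len(key_id))            # roughly 44 characters -- tweet-sized
```

Compare that with a traditional armored PGP public key block, which runs to dozens of lines – the difference is what makes “paste your ID in a tweet” practical.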
But what could make both of these apps truly accessible to mainstream users is that there is nothing to install. Both Cryptocat and miniLock can run in a user’s browser – something that was once considered unthinkable in the cryptographic community. After all, everyone has a browser, and the number of things we do in our browsers has, in recent years, only increased. Moving encryption from the command line to your address bar only makes sense. Now, both Google and Yahoo have taken note, and this summer announced they are implementing their own browser-based cryptography for encrypting e-mail as well.
“When that started happening I yelled a giant ‘I told you so’ to all my peers in the field,” says Mr. Kobeissi. “And they still make fun of me about it today – how ‘I told you so’-ish my attitude was. But I don’t regret it. I couldn’t help it. Because I really was fighting an uphill battle for the longest time.”
Of course, as these tools become easier to use – as simple as installing an app or opening a page in your browser – you could argue that it will become harder to know who and what to trust. Even now, it already is. Short of giving people a small list and saying “don’t use anything else,” it’s a hard problem to solve.
There are a few general rules that most people in the cryptography and security community will tell you: that you shouldn’t trust apps or services that don’t have technical explanations on their site, and haven’t had their code audited or checked by credible third-party researchers. Their code should also be open source and available for anyone to examine, and written by cryptographers and computer scientists who are known to others in the community.
Otherwise, “they’re probably selling you snake oil,” says Mr. Green. “That’s the usual rule I use. There are a few exceptions to that, but not very many.”
Things that don’t meet at least some or all of those rules may not be insecure, but a good rule of thumb is that, until proven otherwise, it’s common sense to treat them as if they are.
Signal may not ever be the only app people use to communicate with one another, says Ms. Corbett Moran, but she’d like to see it join the pantheon of apps that you can realistically expect your friends and family to have installed on their phones by default. In the same way that some friends only communicate via Facebook Messenger, or others via Google Hangouts, there may be a day when Signal is installed with your other communication apps too.
“The critical mass is where maybe not everything gets sent by default encrypted,” says Mr. Green, but the ability exists. “So if I want to send you an encrypted message right now, we don’t have to spend 30 minutes setting it up. It’s just there.”