Anne-Marie Slaughter is president and CEO of New America. Stephanie Hare is a researcher, broadcaster and fellow at Foreign Policy Interrupted.

Technological developments in recent years have highlighted not only the benefits of big data, but also the need to come to terms with the dangers it poses to our privacy, civil liberties and human rights. Nowhere is this question more relevant than with the latest source of that data: our bodies.

Law-enforcement authorities around the world are building and using technologies to identify us from our biometrics, including our face, fingerprints, DNA, voice, iris and gait. Long used in passports and at border crossings, these unique identity markers have many other applications.

For years, we have allowed governments and companies to gather and analyze our biometric data whenever we have applied for a driver’s licence, a travel visa, naturalization or certain jobs, or even simply gone to an amusement park. Increasingly, we use our fingerprints or our faces to unlock our smartphones, pay for purchases and board airplanes.

The protection against theft is obvious: What use is a phone, car or ticket that will only work for its legitimate owner? Above all, biometrics can protect against theft of our identities themselves.

That is the argument behind the world’s largest biometrics project, a multimodal solution (iris, fingerprints and face) affecting more than one billion Indians. Nandan Nilekani, the chairman of Infosys who left his job to create the system, known as Aadhaar, credits it with saving the Indian government roughly US$9-billion by eliminating duplicate and false identities in government beneficiary lists.

Thanks to Aadhaar, more than a half-billion people have connected their digital IDs directly to a bank account, allowing the government to deposit more than US$12-billion without the risk of fraud, theft or – especially important for women – the male drinking and domestic violence that frequently accompanies sudden infusions of cash. For many of India’s poor, living in unmapped villages or slums, a digital ID gives them official personhood – just as a birth certificate or social-security number does in developed countries.

But biometrics increases the likelihood of Jeremy Bentham’s Panopticon, the dystopia of an all-seeing state. China makes no effort to hide its use of biometrics and artificial intelligence (AI) to police its population. Less well known is the advanced use of biometrics in liberal democracies.

In the United States, a study in 2016 by the Center on Privacy & Technology at Georgetown University Law Center found that the facial images of more than 117 million Americans – nearly half of all U.S. adults – were held in U.S. law-enforcement databases, some of which are accessible by the FBI. Next month, Customs and Border Protection will start using a new facial-recognition technology as part of a larger Biometric Exit Program already operating in the airports of eight U.S. cities.

In Britain, the facial images of 12.5 million people, hundreds of thousands of whom are not guilty of a crime, are stored in the Police National Database (PND), while HM Revenue and Customs (HMRC) has gathered more than five million voice recordings without consent. This defies a 2012 British High Court ruling that ordered the Home Office to delete face and voice biometrics of detainees who have been released without charge or acquitted – in line with the law requiring the deletion of DNA and fingerprints.

The collection and storage of people’s biometric data fundamentally changes the relationship between citizen and state. Once presumed innocent, we are now, in the sinister words of former British home secretary Amber Rudd, “unconvicted persons” – people who have not been found guilty of a crime, yet.

This shift has not gone unchallenged. In Britain, the South Wales police and the Metropolitan police face legal action from Liberty and Big Brother Watch, respectively, for their use of automatic facial recognition. In the United States, the city of Orlando, Fla., has abandoned its trial of Amazon’s Rekognition facial-recognition software.

India’s biometrics system also faces legal challenges. While the government made signing up to Aadhaar voluntary, it is effectively mandatory for anyone who needs to access government services, open a bank account or obtain a mobile phone contract. Yet, obliging Indians to use Aadhaar became illegal in 2017, after the Supreme Court ruled that the “right to privacy … [is] an intrinsic part of the right to life and personal liberty.” The Court upheld the government’s authority to curtail privacy rights for a compelling reason, such as national security, crime prevention or social welfare; but the action must be reasonable and proportional to the end sought.

More worrying is that Aadhaar is not secure. In January, 2018, reporters at The Tribune newspaper in India paid 500 rupees (around $9.50) to get a login and password that enabled them to access the name, address, postal code (PIN), photo, phone number and e-mail of every person in the system. For just 300 rupees more, the reporters could print out – and start using – copies of anyone’s unique identity cards.

Years of massive data breaches in the United States (affecting companies such as Target, Yahoo, LinkedIn and Intel, as well as the federal government’s Office of Personnel Management), and reports of companies such as Facebook and Google handing over personal data to developers and other third parties, have led to little concrete change. This may reflect a lack of incentives: While identity fraud resulting from such breaches is tiresome and time-consuming to resolve, any financial pain is ultimately borne by banks and credit-card companies.

It is a different world of pain when our biometric data is compromised, because unlike our usernames or passwords, biometric data cannot be reset. Moreover, errors are even harder to correct. And when paired with other data about us (financial, professional and social), our biometrics can be fed into algorithms and used to deny us loans, health insurance and jobs, guess our sexuality or political preferences, and predict our likelihood to commit crimes – entirely without our knowledge.

Having a unique, unforgeable identity could be a blessing. But we must identify and protect against the many ways that it can become a curse.

Copyright: Project Syndicate, 2018. www.project-syndicate.org
