One of the most common forms of biometric identification is when our face is compared with a stored facial image, such as a driver's licence or passport photo. Facial-recognition technology automates this process.
First, a biometric "template" - a representation of you - is generated from measurements of your physiological traits (in this case, the image of your face), and this template is retained in a database. Samples from subsequently captured facial images can then be compared against this template: if there's a match, you're identified.
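The enroll-then-match process described above can be sketched in a few lines of code. This is a simplified illustration only: real systems derive the template from a trained face-recognition model, whereas here the "templates" are just hypothetical numeric feature vectors, and the similarity threshold is an arbitrary assumed value.

```python
import math

def cosine_similarity(a, b):
    # Compare two feature vectors by the angle between them:
    # 1.0 means identical direction, values near 0 mean unrelated.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def is_match(enrolled_template, probe_sample, threshold=0.98):
    # A probe "matches" when it is sufficiently similar to the
    # stored template; the threshold here is an assumed example value.
    return cosine_similarity(enrolled_template, probe_sample) >= threshold

# Hypothetical feature vectors standing in for real face embeddings.
enrolled = [0.2, 0.8, 0.1]
print(is_match(enrolled, [0.21, 0.79, 0.12]))  # similar capture
print(is_match(enrolled, [0.9, 0.1, 0.5]))     # different face
```

Note that the stored template itself is the sensitive artifact: whoever holds it can run this comparison against any future photo of you.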
Imagine a scenario where you're walking down the street or attending a sports event or shopping at a mall, and your photo is taken, identified, tagged and matched against a database of facial templates, without your knowledge or consent. This would be an affront to privacy that should not be tolerated.
Two key developments are making this scenario possible. First, sophisticated, high-resolution cameras in surveillance systems - and now conveniently embedded in our mobile devices - allow for the frequent capture of high-quality facial images "on the move." Second, software is now available that can index vast numbers of photos, allowing for the creation of biometric databases. All of the photos we put on the Internet and social media, as well as other information about us that allows for the tagging of these photos, may now be accessible. Taken together, these developments make it much easier for you to be automatically recognized, and far more accurately than before.
Your facial image and identity are your personal information. Being unique in nature, this biometric identifier can represent you in the digital world, and may be misused, lost or stolen, leading to potential matching, tracking, impersonation and other deceptive practices. Accordingly, there are significant privacy and security challenges to facial recognition that must be overcome to ensure that any "unanticipated" negative effects are avoided. Beware of unintended consequences!
The most serious of these challenges is the linkage of your biometric template across multiple databases, for uses that were never intended. One's identity may now be routinely shared online by others, along with one's personal profile and geo-location data. As facial recognition becomes widespread, your biometric template could be used to identify you across all of these data sources.
Privacy is all about freedom of choice and personal control. We need to realize that the same technology that threatens privacy may also be enlisted in its protection. This entails the use of Privacy by Design - embedding privacy directly into technologies and business practices, resulting in both privacy and functionality.
But video surveillance and facial recognition need not be privacy-invasive. A system using biometric encryption is highly privacy protective, yet accurate and secure, leaving no digital trail of biometric templates behind. It's a solution that doesn't store the biometric template itself but rather a "private" template in which the biometric is irreversibly bound to a cryptographic key. It's currently being used by the Ontario Lottery and Gaming Corp.
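The core idea behind biometric encryption - storing a "private" template that binds the biometric to a cryptographic key, rather than the biometric itself - can be sketched along the lines of a fuzzy-commitment scheme. This is an assumed, simplified construction, not the OLG system's actual design: a deployed system would apply error-correcting codes so that a slightly noisy fresh sample still recovers the key, whereas this sketch assumes an exact re-reading.

```python
import hashlib
import secrets

def enroll(biometric_bits: bytes):
    # Bind a freshly generated random key to the biometric.
    key = secrets.token_bytes(len(biometric_bits))
    helper = bytes(k ^ b for k, b in zip(key, biometric_bits))
    # Only the helper data and a hash of the key are retained -
    # neither reveals the biometric or the key on its own.
    return helper, hashlib.sha256(key).hexdigest()

def verify(biometric_bits: bytes, helper: bytes, key_hash: str) -> bool:
    # A fresh biometric reading unlocks the key; a non-matching
    # reading yields garbage that fails the hash check.
    candidate = bytes(h ^ b for h, b in zip(helper, biometric_bits))
    return hashlib.sha256(candidate).hexdigest() == key_hash

# Hypothetical bit-string standing in for extracted facial features.
bio = b"facial-feature-bits"
helper, key_hash = enroll(bio)
print(verify(bio, helper, key_hash))                  # genuine user
print(verify(b"a-different-reading", helper, key_hash))  # impostor
```

The privacy gain is that a breach of the database exposes neither a face image nor a reusable template - only helper data that is useless without the live biometric.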
The OLG serves millions of repeat customers a year, at numerous gaming facilities. For self-declared problem gamblers in Ontario, the OLG maintains a totally voluntary self-exclusion program that allows individuals to have themselves denied entry to OLG facilities. This program is carried out with the help of an innovative made-in-Ontario facial recognition system that identifies only possible matches with registered gamblers, while ignoring the vast majority of regular visitors, who remain anonymous.
Thanks to careful Privacy by Design planning, innovative use of advances in biometric encryption, and effective data stewardship, Ontario has a privacy-enhanced facial recognition system that can serve as a model for others around the world. Not only is it possible to have facial recognition and privacy, it's now a reality - and it's a win/win strategy.
Ann Cavoukian is Information and Privacy Commissioner of Ontario.