Opinion

This is part of Life After Privacy, a four-part series on the risks, challenges and opportunities for citizens and consumers posed by a world where your private information is widely available to governments and corporations.

The Debate

We have reached an unprecedented moment in history: our personal information now carries monetary value, and it is tracked, accumulated, traded, bought and sold with ever-greater regularity. Every nugget of our day-to-day lives is worth something.

Why? Because the advertising and marketing businesses have become increasingly dependent on detailed personal data. The rise of digital technology has vastly multiplied the number of places where advertisers can talk to us. But it has also created serious noise, meaning that if advertisers have any hope of getting our attention, they have to talk to us very directly and personally. They have to base their messages on our personal needs, desires and life stages, and even on our locations, moment by moment.

Could we be looking at a time when consumers have the power to sell their personal data to the highest bidder? Or where they fight to block any sale or trade in this data? Or, perhaps, where they insist upon wresting control of their lucrative personal information away from third-party brokers, and look to benefit from it more directly? Consumers are going to have to weigh the benefits of personalized marketing against the loss of privacy – and find ways to act on those decisions.

The Debaters

Susan Krashinsky, The Globe's advertising and marketing reporter: Are we really living in the post-privacy age?

Chantal Bernier, Canada's interim federal privacy commissioner: Citizens already have the tools to exercise their right to privacy.

Pam Dixon, executive director of the World Privacy Forum: We need to fight for a new concept of privacy for the digital era.

Avi Goldfarb, professor of marketing at the University of Toronto: There are both benefits and costs to sharing personal information.

The Discussion

Susan Krashinsky: Facebook founder Mark Zuckerberg has suggested that we are in a post-privacy age. That may be a scary thought for many people, but it’s a plausible one. It’s easy to see how frequently our personal information is collected online. In order to avoid being tracked at all, a person would have to go so far off the grid that it would be essentially impossible. Maybe resistance is futile.

Chantal Bernier: We are moving into a new privacy age, not a post-privacy age. Survey after survey reveals an increase in Canadians' concern for privacy. The rise in complaints to our Office of the Privacy Commissioner shows they resist -- they know why privacy is important and they are ready to act. Within weeks of a breach of student-loan information at Employment and Social Development Canada being made public, we had received over a thousand complaints. Within days of Bell's announcement that it was aggregating personal information to serve interest-based ads, we had over 100 complaints. Those complaints are not futile, as we demonstrated a month ago with our findings on Google's online behavioural advertising: not only did we lift the cover on the inner workings of online advertising, but we prompted Google to implement greater privacy protection.

We exercise the right to privacy by choosing what personal information we share. Technological and social changes have altered the modalities of privacy, but not its principles. We will always need privacy as a matter of freedom and personal integrity. Even Mark Zuckerberg resists: he exercises his right to privacy as fundamental to his well-being, and he is very concerned about it.

Avi Goldfarb: I agree that people are more concerned with privacy today than they were in the past. In a research paper, MIT professor Catherine Tucker and I showed that people asked to reveal personal financial data have become more likely to refuse over time. At the same time, it is important to recognize that there are more benefits to providing such data than in the past. Companies can use data to provide free services and to better serve customers. Some of these services (such as social networks) are only valuable if people provide personal information.

Therefore we see an apparent paradox. People care more about privacy and yet they seem to be revealing more and more about themselves online. Our research suggests a very good reason for this: both the costs and benefits of revealing information have risen over time.

Pam Dixon: When captains of the digital economy say that privacy is dead, they are right in that privacy by obscurity -- an old paradigm of information being obscured due to the sheer difficulty of digging through dusty paper tomes -- is most certainly going away. In the digital era, data doesn’t hide like it used to when everything was stored on media such as paper.

But privacy is a core human value, and as such it will not be dead anytime soon. Large datasets, for example, present a good thought experiment. How can large datasets filled with incredibly valuable analytical data be used while still protecting core privacy values? The answer cannot be a simple “we are afraid, therefore let’s shut down all data use.” Nor can it be, “we want this data, so let’s ensure unfettered, unconstrained use.” Both approaches are too blunt. We must re-imagine privacy for a digital era.

Krashinsky: A recent study from Microsoft found that Canadians are open to selling their online data, for the right price. As consumers become more aware of the value of their data, there could be greater demand for it to be brought into the value exchange. That could be the future of how information is used – and paid for.

Dixon: First, I am not nearly as concerned about online advertising data as I am about deeper uses of highly sensitive and personally identifiable information. For example, health data is highly sensitive and needs careful consideration. Should it be sold?

Second, I am concerned about vulnerable populations. Would the financially vulnerable be more likely to sell highly personal data about themselves, becoming modern-day Les Misérables? Instead of teeth and hair, now it’s your data you sell.

Third, I am concerned about ensuring that the data being sold actually belongs to the individual selling it. This is trickier than it may seem at first, and would involve strong identity authentication, potentially including biometrics. Do we really want to build that into our online world for this purpose? It certainly gives me pause.

Goldfarb: In many ways, consumers are already compensated for their data through high-quality free services and through products that match needs.

The key questions are therefore, first, whether paying people with dollars rather than services makes sense to consumers and to industry, and, second, whether such a system would eliminate most privacy concerns.

Regarding the first question, if the costs of setting up such a system are low enough, then the market can determine the proper price of data just as the market determines the proper price of many other products and services. The second question is much trickier. The problem is that people also reveal information about themselves by choosing not to sell their data. For example, suppose life-insurance companies were allowed to purchase data about people's health status and use that information to determine prices. Healthy people would happily sell their data. Unhealthy people would choose not to sell their data. Then, the insurance companies could infer that the people who didn't sell their data were unhealthy. In other words, the act of choosing not to sell the data ends up providing information. This means that pricing data cannot be a full solution to the challenges of data and privacy.
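Goldfarb's point that refusing to sell is itself informative can be illustrated with a small, purely hypothetical simulation. The population size, health rates and selling probabilities below are illustrative assumptions, not figures from the debate or from any study cited in it; they simply show how an insurer could profile the non-sellers.

```python
import random

random.seed(42)

POPULATION = 100_000
BASE_RATE_UNHEALTHY = 0.30    # hypothetical share of unhealthy people overall
P_SELL_IF_HEALTHY = 0.90      # healthy people usually gain by proving their status
P_SELL_IF_UNHEALTHY = 0.20    # unhealthy people usually prefer to stay silent

non_sellers = 0
unhealthy_non_sellers = 0
for _ in range(POPULATION):
    unhealthy = random.random() < BASE_RATE_UNHEALTHY
    p_sell = P_SELL_IF_UNHEALTHY if unhealthy else P_SELL_IF_HEALTHY
    if random.random() >= p_sell:          # this person chooses NOT to sell
        non_sellers += 1
        unhealthy_non_sellers += unhealthy

print(f"Unhealthy share of the whole population: {BASE_RATE_UNHEALTHY:.0%}")
print(f"Unhealthy share among non-sellers:       {unhealthy_non_sellers / non_sellers:.0%}")
# With these made-up numbers, roughly three quarters of non-sellers are unhealthy,
# so the insurer learns a great deal from the refusal itself.
```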

Bernier: So we have become aware of the value of our personal information. Good. Our own survey showed that 55 per cent of respondents had walked away from an app or an Internet service because they did not feel comfortable with its proposed use of personal information.

Rules of fair trade in personal information should emerge: Companies must be transparent about what data they collect and what they do with it; they must give us clear, effective means to opt in or out; they must take all the necessary measures to safeguard the data; and, should they fail in any of these obligations, they must give us remedies for correction. It's important to remember that the Internet runs on personal information. For example, Google reported that over 90 per cent of its revenues come from advertising, and its advertising is largely based on personal information. That means we have the upper hand. The right to privacy then takes the form of voting with our feet: demanding transparency and clear controls so we can choose to be in or out, and sharing only the personal information we deem fair.

Goldfarb: I agree with Chantal that the most effective way for consumers to increase privacy protection is by “voting with our feet.” At the same time, I disagree with her interpretation of current consumer trends. Her office's survey showed that fully 45 per cent of respondents had never walked away from a single app or internet service over its proposed use of their information. That is a very low bar. Consumers access dozens if not hundreds of apps and services over time, and only a slight majority have refused even once. I interpret this to suggest that firms have not needed to adjust their practices because, at least so far, consumers have been willing to give companies data without much transparency. This might be driven by consumer ignorance of data practices, or it might be that consumers value the services enabled by data, such as Amazon’s product recommendations, Google search, and Facebook.

Bernier: We do not put enough emphasis on consumer power. Yes, technology can be overwhelming, with the proliferation of algorithms that have us profiled, tracked and pigeon-holed. But it took only one consumer to come to our office a year ago with a concern that Google's online advertising service was violating his privacy. We investigated and Google ultimately agreed to change its ad-monitoring process to ensure compliance with its own privacy policy and Canadian privacy law. Google was cooperative, I believe, out of concern for reputational damage. Google knows consumers care about their privacy – and that its reputation will suffer if it fails to respond to consumer concerns. So, we are not powerless. Just one person can make a big difference. As long as we care about our privacy, business will have to as well.

Krashinsky: A recent poll from Harris Interactive found that more people trust major online retailers such as eBay and Amazon with their data than trust the government with that personal information. This was an American study only, but still raises interesting questions in Canada and globally about our attitudes toward privacy. The big e-retailers trailed only health providers (doctors and hospitals) in trust (74 per cent trusting those companies versus 79 per cent with health care providers). Search engines and Web portals such as Google and Yahoo had 49 per cent trust, and the government had 48 per cent. Social networks were down at 28 per cent.

That’s a powerful message about consumers’ understanding of privacy: trust in the companies that mine our data is relatively healthy, while services such as Facebook -- where we willingly hand over our personal information all the time -- are the most mistrusted. Perhaps our trust is misplaced.

Dixon: One takeaway for me is that people trust brands. The possible underlying assumption appears cogent: a brand has more at stake and may be more responsive and more careful with customer data. As for the distrust of social media, in the long view of things, social media is in its infancy. I would be interested in seeing Twitter, for example, polled on this question separately from Facebook to see whether the data reflect a more brand-specific viewpoint. I suspect that, in time, social media brands will develop their own distinct identities with regard to privacy.

Bernier: The fact that people still use certain sites in spite of their privacy qualms probably speaks more to social necessity than misplaced trust. For many Canadians, especially younger Canadians, social networks have become the equivalent of what the phone system was for their parents and grandparents. The low trust level may be a result of extensive media coverage of the privacy issues related to social networking sites, but people may feel they have no choice but to go ahead and use them anyway or risk becoming social pariahs.

The current environment underscores the need for strong privacy laws. There is also a broader need to modernize privacy laws to bring them in line with current business models, social trends, technological platforms and the increasing interplay between private and public sector privacy issues. Strong laws will better protect privacy rights and, at the same time, ensure consumer trust in the digital economy.

Goldfarb: It is important that we get privacy regulation right. With the rise of the data economy, privacy policy is now a key part of innovation policy. If we do not provide enough protection, consumers will be overly wary of digital services and the data economy will not succeed. If we provide too much protection, it will inhibit companies' ability to innovate.

My research with Professor Tucker has shown that the relatively strict regulation of data in Europe has hurt the European internet industry. If Canadians choose to follow the European path, we should be aware that increased privacy is not free. The strict European model of privacy protection comes partly at the expense of innovation in one of the more dynamic sectors of the economy.

Here, Pam’s earlier point that privacy is particularly important for health and other highly personal information is directly relevant. Regulation can vary by sector. For example, Canadians may want strong regulation to protect health data, but we might be willing to allow companies to use many other types of data to provide us with innovative (and often free) services.

Privacy is difficult to define. It means different things to different people. It can even mean different things to the same person in different contexts: people might surrender detailed information about their personal lives to strangers on an airplane, yet guard those same details closely in other settings. As a result, discussions about privacy can be challenging. Without a clear understanding of the concept, it is hard to identify problems or propose solutions.