Markets were sent reeling after Britain's unexpected vote to leave the European Union. Less discussed in the analysis that followed this result was what lessons we can all take from our collective failure to predict it. The Brexit shock is a perfect instance of two crucial decision-making failure modes: overreliance on data and the presence of biases.

In this era of analytics and data, it's almost refreshing to see how wrong our celebrated predictive machines can be. Bookmakers, who stood to lose real money (and who thus invest a lot in getting this sort of thing right), were predicting with 90-per-cent certainty that the Remain side would win, right up until the polls closed and the results came in. Social-media analysts were celebrating how their real-time, enormous sample sizes were the most accurate indicator of how the vote would go, and were also strongly predicting that Remain would triumph.

While these groups rapidly backpedal, looking for the faults in their algorithms – that too many rich people and millennials skewed the data sets seems to be the latest justification – we look at this as a lesson in the dangers of overreliance on analytics to predict the future. Yes, data and analytics are important. But such information is by nature based on historical fact (a dangerous tool for predicting the future) and limited samples (a poor indication of the intentions of a heterogeneous group).

Biases are another important part of the reason so many forecasters called this wrong. Just as we missed the "impossible" rise of Donald Trump, Brexit caught us by surprise because most of us (and most of the people who work in the news media) wouldn't have voted for either one. Our own views and desires bias us to expect others to act as we would, and because these views often stem from our geographical and socioeconomic backgrounds, they can be particularly dangerous when we're assessing political or economic decisions.

In groups, this can be even more dangerous – our own personal biases are supported by others and groupthink takes hold, with no one daring to question the group's perspective at the risk of seeming to be foolish or an outsider.

Three ways to diminish the risk of overreliance on analytics or biased forecasting are the use of premortems, devil's advocates and self-reflection. These are tools that we all (including the market research organizations and newsrooms of the world) can implement more systematically to avoid shocks such as the Brexit result.

  • Premortems start with imagining that you are wrong, dead wrong, and that the worst has occurred. You then ask, what could be the cause of this predictive failure? Through this type of questioning, we can identify the limitations of the available data and dig deeper to improve the quality of the information used.
  • A devil’s advocate is appointed to ensure that contrarian positions have a voice at the table when groups are making decisions, but they are also useful on an individual basis. This person’s role is to argue against the group’s intention – essentially stating why everyone else is wrong. By clearly nominating someone to take this on (or by forcing yourself to question your own assumptions in this way), we free the advocate from the constraint of not wanting to go against the position of the group and in doing so allow them to highlight our collective blind spots.
  • Self-reflection (by an individual or a group) is more of a habitual practice – ensuring that you think deeply about how your background, beliefs and socioeconomic context bias your views. From the people you regularly interact with to the Facebook algorithm that pushes content to your stream, your view of the world is curated by your context. Forcing yourself to acknowledge this and actively seek out opinions counter to your own will diminish the influence your personal situation has on your decision-making, broaden your context and expand the range of data you'll use to inform your decisions.

It's not that data and analytics are inherently bad, or that our instincts have no place in decision-making, but rather that both can be flawed.

By recognizing and using a set of tools to overcome these flaws, we can be much more effective decision-makers and avoid (and perhaps profit from) the shocking and the unexpected.

Mike Ross and Davide Pisanu are co-founders and Blanche Ajarrista is an analyst at Montreal-based boutique consulting firm Juniper.
