Nate Silver predicted the 2012 U.S. presidential election with ease, using his analysis of polls to get every state right. But in 2016 the data expert failed.

Or did he?

His analysis pointed to a Hillary Clinton win, which didn’t happen. She did attract more votes than Donald Trump, but not by as wide a margin as Mr. Silver foresaw, and, crucially, she lost three states he saw in her camp: Pennsylvania, Michigan and Wisconsin. Before Election Day, though, he kept warning that Mr. Trump could win: his simulations pointed to roughly a 65-per-cent to 70-per-cent probability of Ms. Clinton triumphing, which left Mr. Trump about a one-in-three chance. Only we didn’t listen. He seemed to be needlessly academic – she was clearly ahead, so she would win.

That’s a reminder of the importance of understanding, and accepting, probability – one of the many ways we go wrong in handling data in our data-enriched world. It matters because data can be the foundation for strategy and decision-making, and for mistakes in both.

Indeed, one thread I see in recent writing on those topics is the importance of considering the probability that your decision is right (or wrong). Often there is doubt when a strategy is hammered out, but leaders, blinkered, move on to implementation and start to believe wholeheartedly in what was once only a theory. What if you assigned a probability that the strategy won’t work?

In January, Walter Frick, a senior editor at Harvard Business Review, echoed this when he looked at what the magazine had published on decision-making over the years and came up with three rules:

  • Be less certain: Overconfidence is the bias that Nobel-prize-winning psychologist Daniel Kahneman has said he would eliminate first if he had a magic wand.
  • Ask, “how often does that typically happen?”: If you are considering an acquisition, for example, ask how often such deals typically succeed.
  • Think probabilistically: He says research has shown that even relatively basic training in probability makes people better forecasters and helps them avoid some cognitive biases. “Improving your ability to think probabilistically will help you with the first two rules. You’ll be able to better express your uncertainty and to numerically think about ‘How often does this usually happen?’ The three rules together are more powerful than any of them alone,” he stresses.
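For readers who like to see the arithmetic, here is a minimal sketch in Python – my illustration, not Mr. Silver’s model, with the 70-per-cent figure assumed from the range cited above – of why a 70-per-cent favourite still loses surprisingly often:

    import random

    # A minimal illustration (not Mr. Silver's actual model): treat the forecast
    # as a 70-per-cent chance of a Clinton win and simulate many elections to see
    # how often the "unlikely" outcome still happens.
    CLINTON_WIN_PROB = 0.70   # assumed value, taken from the 65-70% range cited above
    TRIALS = 100_000

    trump_wins = sum(random.random() > CLINTON_WIN_PROB for _ in range(TRIALS))
    print(f"Trump wins in {trump_wins / TRIALS:.1%} of simulated elections")
    # Prints roughly 30% - an outcome to plan for, not to dismiss.

Run it a few times and the “upset” shows up in nearly a third of simulated elections – which is exactly the point of rule three.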

The problem is that data today often make us feel more certain. We hear stories of success by database, such as the analysts who discovered that men were buying diapers and beer late at night in supermarkets, so those goods should be placed closer together. But we need to be cautious.

Celebrated management thinker Peter Drucker told us that what can’t be measured can’t be managed. McGill University professor Henry Mintzberg loves to have fun with that idea, pointing out that we can’t measure management and we can’t measure measurement – so beware of managers obsessed with measurement.

Most businesses have detailed data on sales, market-research studies on behaviour of customers and an overflow of information from their website on what is being studied and bought. Amazon is a paragon in this field, but when I mention in conversation that it has been dead wrong in its attempts to lure me to further purchases, other people tend to agree. Many websites feed us images of items we searched Google for and decided not to buy. Maybe that’s overconfidence on our part, but if the leading data-analysis companies are flailing, that’s worth considering in our own more modest efforts. Another danger is assuming that the seductively detailed analytics your website coughs up can tell you something about your other customers. Not necessarily.

Consultant Roy Williams warns about the tendency to confuse statistical correlation with causation. Another limitation he flags in his Monday Morning Memo is that data cannot tell you the right thing to do; they can only tell you the result of what you have done.

We can also wield data as a cudgel, to prove our point. That makes our decisions evidence-based, we claim. But Mr. Williams counters that all decisions are evidence-based; the crucial factor is whether you have interpreted the evidence correctly. “The intellect can always find logic to justify what the heart has already decided. Consequently, data is often used in the same way that a drunk man uses a lamppost; for support, not for illumination,” he notes.

So sober up on your use of data. Be less certain, more accurate.

Cannonballs

  • Research shows that people are more likely to follow advice when they think it comes from an algorithm than from a person.
  • Fighting back against LinkedIn, ad contrarian Bob Hoffman listed his job as chief aggravation officer. He was recently advised that LinkedIn has filled 5,475 chief aggravation officer jobs. I searched and found over 1,500 job results for that post. Hmmm.
  • McKinsey & Co. warns that a red flag with advanced analytics comes when the executive team lacks a solid understanding of the difference between traditional analytics, which involve business-intelligence tools and reporting, and the more advanced possibilities offered by predictive and prescriptive tools such as machine learning. The consultants also urge you to hire analytics translators who can help business leaders identify high-impact analytical approaches and then explain the situation to data scientists.
