Polling techniques have come a long way since the days of door-to-door pollsters hunting for information on soap or politicians. Today, online questionnaires, telephone polls and focus groups are the tools of the trade, and everything from sex lives to personal finances is fair game.
In Canada, the Marketing Research and Intelligence Association represents more than 1,800 pollsters and buyers of research, such as financial institutions, retailers and manufacturers, accounting for almost $750-million in research annually. Polls are widely used to craft policies, for marketing and business forecasts or before launching projects.
"Polling is a business," says Shachi Kurl, vice-president at Angus Reid Global. "The business sector is a big customer."
While company executives consult survey results such as The Most Influential Brands or Traditional Marketing Still Makes Cents, business professors aren't entirely sold on the pricey data.
Completed reports can sell for $10,000, while commissioning a survey or forming a focus group costs several times more.
"I discount a lot of the polls," says Werner Antweiler, associate professor at the University of British Columbia's Sauder School of Business in Vancouver. "There's no such thing as a perfect poll, just an approximation of opinion."
Sharing that opinion is Marc-David Seidel, also an associate professor at Sauder.
"There's so much data now, so many statistics," he says. "So, we hit our students early. We tell them, 'Don't take it at face value.' Every time they see a statistic, they should question what the underlying bias is."
In today's information-saturated environment, people have personalized digital feeds that flood them with content they choose, typically material they agree with. In an echo chamber of ideas, critical reasoning is dampened.
Aware of the glut, Dr. Seidel's commerce students complete several projects about not accepting information at face value.
For one assignment, students take a news article that cites a poll and track down who commissioned the poll, who funded it and who analyzed it.
Certain polls, particularly political surveys, are designed "very deviously," employing the "push-polling" method, which uses loaded questions to create bias among voters, Dr. Seidel says.
In another exercise, students design their own data-collection poll. Classmates who evaluate one another's work discover that each student unknowingly builds their own biases into their poll. "If you understand the built-in biases, you can get closer to the message," Dr. Seidel says.
At Concordia University's John Molson School of Business in Montreal, a marketing research course addresses how information from polls affects managerial decisions.
In the course, students conduct their own research project, using surveys or field experiments to understand potential sources of error.
"The idea is that they may be consumers of marketing research projects and, by understanding potential sources of error, they will be better judges of the quality of such projects, from sampling to interpretation," says Onur Bodur, an associate professor of marketing.
When teaching about the perils of polling, Dr. Antweiler warns of leading questions, the order of the questions, how undecided respondents are treated, how many people participate and if the same questions were previously asked differently.
The Holy Grail of sampling is the costly random panel or focus group, which typically targets users of the product at hand, Dr. Antweiler says.
But less expensive online polling has gained prominence, he says. One acknowledged danger with online polling is that it doesn't represent the larger population.
Respondents aren't selected randomly, so the results aren't probability-based. Sometimes prizes are used to entice respondents, which can skew the results by drawing people motivated by dollar signs.
And are people actually truthful? asks Dr. Antweiler.
Another challenge is that respondents don't like to admit that they do bad things. "There's the social desirability bias," Dr. Seidel says. To counter that problem, a large pool of respondents helps dilute the impact of lying, ignorance or response bias.
Meta-polls, which aggregate several smaller polls, errors included, offer one solution. A super poll with perhaps 6,000 respondents gives a better picture than a single 1,500-respondent survey, Dr. Antweiler says.
"Never trust a single poll. Give me 10 polls," he says.
Another method is "oversampling": instead of a one-day snapshot from 300 respondents, a three-day picture emerges from 900. It's key to remember that poll results are truly snapshots, someone's opinion on one day, which can change a week later.
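The arithmetic behind "more respondents, better picture" is the standard margin-of-error formula for a simple random sample, which assumes the probability-based sampling that online panels rarely achieve. A quick sketch (the sample sizes echo the figures quoted above; the formula itself is standard survey statistics, not a method attributed to the pollsters interviewed):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a proportion p from a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (300, 900, 1500, 6000):
    print(f"n={n}: +/- {margin_of_error(n) * 100:.1f} points")
# n=300:  +/- 5.7 points
# n=900:  +/- 3.3 points
# n=1500: +/- 2.5 points
# n=6000: +/- 1.3 points
```

Because the error shrinks with the square root of the sample size, quadrupling respondents from 1,500 to 6,000 only halves the margin of error, and none of this accounts for the non-sampling errors (question wording, self-selection, dishonest answers) the professors warn about.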
Polling companies are aware of challenges they face.
"We do a lot of monitoring," says Ms. Kurl. Oversight includes watching for those who speed through surveys or keeping tabs on frequent respondents.
Angus Reid has more than 130,000 Canadian respondents, says Ms. Kurl. But in a world where people are ditching land lines, screen their calls or can't be bothered to answer an online poll, she says refusal rates can be as high as 90 per cent.
Still, Canada's competitive polling industry continues to serve customers willing to pay big money for answers. "They can't afford to get it wrong," Dr. Antweiler says.
Remember 2012, when pollsters wrongly predicted that Alberta's Wildrose Party would defeat the Conservatives? Then in 2013, most polling firms erroneously said B.C.'s NDP would trounce the Liberals. Pollsters took a hit to their reputations and were forced to re-examine their methods.
"Polling is a very dangerous instrument if you base public policy on it," Dr. Antweiler says. "Don't just rely on polling."