On Aug. 16, Donald Trump held a rally in West Bend, Wis., where he announced that Hillary Clinton wanted to print millions of "instant work permits" for illegal immigrants – "taking jobs directly from low-income African-Americans." In fact, Clinton wants to give green cards to foreign students who have completed master's and doctoral degrees – letting them stay and work in high-tech jobs. (Jobs that would not, by and large, be available to "low-income" Americans of any race.) But never mind the truth. Trump's fabrications proliferate and prosper because he understands that, in the age of screens, drama trumps truth.
What matters is whether something feels like the truth. Does it feel like your job has been stolen? Does it feel like Hillary Clinton doesn't care about "real" people? An American who's watched economic inequality balloon since the 1970s may well feel these things. Such disenfranchisement, combined with a wild-west media landscape in which retweetable quotes take precedence over verifiable facts, produces the toxic intellectual environment that Daniel J. Levitin confronts in his smart, timely and massively useful primer for "critical thinking in the information age," A Field Guide to Lies.
As Levitin notes, we've created more information in the last five years than in all our preceding history. Meanwhile, we message, announce and declaim at an unprecedented rate. Unfortunately, the democratization of broadcasting systems brings with it a profusion of misinformation, half-truths and no-truths. Sometimes this is produced by what Levitin calls the "lying weasels" of the Internet, but it's very often the result of simple confusions on the part of everyday people – confusions that his book means to dispel. New forms of media require new forms of literacy.
Unlike previous realities, Levitin writes, the age of screens has "no central authority to prevent people from making claims that are untrue." And so his book calls for independent analysis: Anyone who consumes easy, cheap, fast information must understand how to verify that information themselves. It should be clear by now that nobody else is doing it for you.
The joys of "the information age" are obvious; its pernicious qualities, while vaguely felt, are more difficult to detail. Meaning becomes lost in amalgamation; artful quotation alters original work; and, perhaps most damningly, the value of a viral story supersedes the value of a true one (this is news as meme). "Many of us learned of Michael Jackson's death from TMZ.com before the traditional media reported it," Levitin writes. "TMZ was willing to run the story based on less evidence than were the Los Angeles Times or NBC." The Internet promotes and encourages a lower standard of fact-checking and journalistic ethics, rewarding speed over veracity. Traditional journalists scramble to keep up, leading to debacles such as the Washington Post story that Pulitzer Prize-winner Jonathan Capehart wrote, which turned out to be based on a tweet by "a non-existent congressman in a non-existent district." Nobody stands outside the fog.
A professor of psychology and behavioural neuroscience at McGill University, Levitin has a mission not unlike the mission of scientists in the 17th century who sought to convince a disbelieving public that popular stories about the heavens were not always true. And his solution is the same as theirs: he proposes that we recall the basic principles of the scientific method. "The plural of anecdote is not 'data,'" he warns. We must look clearly, unsparingly, beyond what an argument makes us feel and learn to spot logical fallacies; we must learn to tell the difference between inductive and deductive reasoning, and learn how to read statistics. The "Field Guide" Levitin's created really is as instructive in these matters as its title suggests. At times it does veer toward a textbook tone, but this is because the author sincerely means to instruct. Levitin's book is a primer for those ready to interrogate what they think they know.
There is a basic logical fallacy, for example, called post hoc, ergo propter hoc – basically, "B happens after A, so A must be causing B." This is the line of reasoning used by those who argue that the rise in autism can be blamed on vaccines (or WiFi, or GMOs). Between 1990 and 2010, the number of children diagnosed with autism did indeed rise six-fold. And a physician called Andrew Wakefield published a paper in the prestigious Lancet journal arguing new vaccines were to blame. Although his work has been exposed as fraudulent, the argument continues to be batted around online because (post hoc, ergo propter hoc) people think that correlation implies causation. Levitin helps us see the insanity of such arguments. Case in point: If I review this book and, the next day, Toronto is not levelled by an earthquake, does that mean my reviews are keeping the city intact? (The rise in autism, by the way, is actually accounted for by a widening of the definition of "autism" and the fact that people are becoming pregnant at later ages.)
Some of these mistakes in thinking are thanks to a fear of numbers. We are easily swayed by the use of "averages," for example. But averages can be wildly misleading. Levitin points out that "On average, humans have one testicle." Our confused understanding of basic terms such as "mean," "median" and "mode" allows partisan forces to shape statistics in any way that serves their purposes. Meanwhile, charts and graphs are also far from clear-cut. One could show that the number of crimes in a neighbourhood is "sky-rocketing" even while the actual rate of crime (the number of crimes for every 1,000 people) has gone down.
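The arithmetic behind both of Levitin's warnings is easy to see with a small worked example. The figures below are invented for illustration – they are not from the book – but they show how a crime count can rise while the crime rate falls, and how one outlier can drag a "mean" far from the "median":

```python
from statistics import mean, median

# Hypothetical neighbourhood figures (invented for illustration,
# not taken from Levitin's book).
crimes_then, pop_then = 120, 20_000
crimes_now, pop_now = 150, 40_000   # more crimes, but twice the people

rate_then = crimes_then / pop_then * 1_000   # crimes per 1,000 residents
rate_now = crimes_now / pop_now * 1_000

assert crimes_now > crimes_then   # the raw count "sky-rockets"...
assert rate_now < rate_then       # ...while the rate falls (6.0 -> 3.75)

# "Average" is equally slippery: one outlier drags the mean
# while the median stays put.
incomes = [30, 32, 35, 38, 40, 1_000]   # thousands; one very rich resident
print(round(mean(incomes), 1))   # 195.8 -- "the average resident is rich"
print(median(incomes))           # 36.5  -- the typical resident is not
```

The same numbers support opposite headlines depending on which summary a partisan chooses to report – exactly the manipulation Levitin's book teaches readers to spot.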
Even more troubling, though, are the mistakes we make when consuming rhetorical arguments. We blindly accept the opinions of the wrong people, forgetting that "intelligence and experience tend to be domain-specific, contrary to the popular belief that intelligence is a single, unified quantity." Such reasoning would have us give credence to the racist views of William Shockley, for example, because he was awarded the Nobel Prize in physics. Additionally, we fall prey to purveyors of "counterknowledge," which is misinformation that someone has actively clothed in the garb of fact. It "runs contrary to real knowledge" but gets spread around because it has greater social currency than truth.
Throughout Levitin's book, it's the danger of social-currency-as-truth that most alarms. In an age of "truthiness," we all do well to heed that old maxim (often falsely attributed to Mark Twain): "It ain't what you don't know that gets you into trouble. It's what you know for sure that just ain't so."
There are definite virtues to the democratization of knowledge and the massive uptick in online platforms – Levitin doesn't deny it. But what this book so expertly argues for is the recovery of quieter virtues that we've let slip along the way. Humility, for one: Do we still make room for "the things that you weren't even aware of that are supremely relevant to the decision you have ahead (the unknown unknowns)"? This humility, says Levitin, is one of the most important parts of critical thinking. "When bridges collapse, countries lose wars, or home purchasers face foreclosure, it's often because someone didn't allow for the possibility that they don't know everything and they proceeded along blindly thinking that every contingency had been accounted for."
We've grown quick to outrage, quick to form online lynch mobs; we trade our opinions and "facts" as though they were beads at a bazaar. Levitin demands that we do better. And that doesn't just mean becoming good fact-checkers or savvy readers of charts and figures. It means taking up real, adult responsibility for our own minds' work. It means becoming critical of our deepest-set beliefs and, like the scientists that Levitin praises, shaping our opinions with the scalpel of honest exchange.
Michael Harris is the author of The End of Absence, which won the Governor-General's Literary Award for Non-Fiction in 2014. His next book, Solitude, will be published in 2017.