Daniel Woolf is principal emeritus of Queen’s University and a professor in the history department.

There are many good candidates for the word of the year from the bizarre 2020: coronavirus; Trumpism; Brexit. But for me the leading candidate is a commonplace one: “normal.” Over the past year, the word has popped up (along with phrases such as “back to normal” and “new normal”) everywhere from mainstream media to social media. But what does it mean and where does it come from?

Norma, the Latin word for a carpenter’s square, migrated into Europe’s vernacular languages during the Middle Ages, showing up principally in the form of “enormity,” mainly to denote monstrous size. But there was really no distinct concept of a “normal” as opposed to a “natural” order of things.

This began to change by stages, starting in the late 17th century. Increasing secularization, skepticism toward things that seemed to defy both common sense and reason, and a greater reliance on observed experience in assessing the truth or falsity of reported events all played a part. So did the advent of what historians sometimes call “classical probability,” which assessed the likelihood of certain outcomes repeating themselves – for instance, in games of chance, and in a nascent insurance industry. Still, the word “normal” and its antonyms were relatively rare in everyday speech.

Around 1800, another inflection point was reached in the entry of the word “normal” into everyday speech. The French Revolution and its Napoleonic aftermath were perceived by many contemporaries as an outlier event – something so unprecedented in world history as to defy, at least initially, reason and explanation. There had certainly been world-shaking events before: the fall of Rome and the discovery of the Americas being two favourites for Enlightenment-era historians. But reports of those events did not have the advantage of the relatively rapid distribution of news – something that would only begin to appear in the 19th century with the advent of the telegraph. And they were by 1800 quite distant: the abnormal seems much less abnormal the further back it is in history’s rear-view mirror. Even the Revolution became less shockingly abnormal as time passed, and 19th-century historians had greater opportunity to reflect, explain and compare, aided by further revolutionary outbreaks in 1830 and 1848.

The other major development in the usage of the word “normal” was the transition from 18th-century probability into its more statistics-based 19th-century successor. The gathering of statistics had long been a key element in the formation of the centralized monarchical state and in the expansion of empires, and it became mathematized at this time. As far as the entry of “normal” into everyday speech is concerned, this was the game-changer. During the century, early statisticians such as Belgium’s Adolphe Quetelet and Britain’s Francis Galton attacked the problem of normality in different ways: Quetelet tended to focus on central tendencies, while Galton explored variations in human characteristics and their heritability.

The concept of what Quetelet had called the “average man” came into use among early social scientists. The “normal” distribution, or bell curve, largely the work of German mathematician C.F. Gauss, also appeared. As noted by historians of probability and chance such as Lorraine Daston, “normal” also displaced “natural” in diagnoses of health during this period – thereby giving rise to our current culture of ableism, and to the binary distinction of “normal” from “pathological.”

As with many words that come into speech by fits and starts, “normal” and “norm” carry multiple senses, depending on audience and context. For some, “normal” is a simple descriptive term, a central tendency, a statement of fact. But inevitably, the words acquire a moral or political value: “She’s not normal” or “Can’t we be like a normal family?” Sentences such as “heterosexuality is normal” can no longer be understood simply as meaning that “the norm” in sexual relationships is between a man and a woman. That is now deemed “heteronormative” because it is not simply a statement of statistical occurrence, but appears to value one form of sexuality as a standard as opposed to others. “Deviation” (also a statistical term, as in “standard deviation”) and especially “deviancy” carry a connotation of flaw, failure or imperfection, either natural or moral.

Why has “normal” become so commonplace this past year? The shocks of an out-of-control American president (frequently described as “norm-busting”) and the “new normal” of wearing masks, physical distancing and conducting meetings by Zoom are the obvious triggers, but there is a deeper psychology at work. In times of crisis, it is natural (or perhaps “normal”) to crave stability, whether it be a return to a vanished old way of doing things or at least settling into some new, but predictable, way of life. The phrase “new normal” is a coping strategy, our way of admitting that things may be irreparably changed, but that they are eventually going to settle down, at least for a time.

And, as with the French Revolution and the fall of Rome, time itself has a normalizing effect. With investigation, analysis and understanding comes a contextualization – a renormalization of that which has previously been considered normal. This is by no means always a good thing: witness the effort by certain German historians in the 1980s to “normalize” the Holocaust by arguing that it was not very different from other forms of atrocity during wartime.

So next time you read that something is not normal or that we need to “get back to normal,” keep in mind that the concept has a history – and that what is considered “normal” often doesn’t stay normal for very long.
