Photo illustration by Bryan Gee/The Globe and Mail
Chris Frey is a five-time National Magazine Award-winner, the Toronto correspondent for Monocle magazine and the founding editor of Hazlitt.
In the wake of actor Luke Perry’s death this week, along with the many nostalgic tributes to his sideburns and his Simpsons cameo, there was some confusion about where the former nineties heartthrob stood on participating in the Beverly Hills, 90210 reunion scheduled to shoot this year. According to several news outlets, Mr. Perry had insisted he had no interest in reprising his role as the show’s sensitive bad boy Dylan McKay, but in a public appearance last October in support of the show Riverdale, on which he had revived his career by playing Archie’s dad, he said he would love to do it.
The premise of the 90210 reunion, purportedly, comes with a twist – catching up not with the show’s characters all these years later, but with quasi-fictionalized versions of its real-life actors as they try to get a reboot of the show off the ground. Which doesn’t seem like a particularly original move these days, such deliberate blurring of fact and fiction being a signature feature of both popular culture and our politics. But it might also open the door to a more nuanced and self-aware look back at the culture of a decade, the nineties, which the show helped to define. It’s a decade we have a hard time remembering right.
We live in a moment already oozing with nineties nostalgia, this weekend’s release of the new Captain Marvel superhero movie, set in 1995 and replete with references to pagers, Blockbuster video and nineties-era internet culture, being just the latest example. (They have even digitally en-youngened Samuel L. Jackson to better resemble his nineties self.) Just try keeping track of all the reboots and revivals: Twin Peaks, Full House, Roseanne, The X-Files; all the hours that cable TV spends reinvestigating who killed Biggie and Tupac. The anthem for this moment arrived last fall, with Charli XCX’s music video for the song 1999. In it, Charli channels Kate Winslet in Titanic, TLC’s Left-Eye Lopes, Eminem, the Spice Girls and iconic moments from American Beauty and The Matrix. Charli may have been born in 1992 but that hardly stops her from singing of the nineties, “Never under pressure / Oh, those days it was so much better.”
Then came the Spice Girls’ announcement – well, most of them – that they’ll be reuniting soon for a tour. You can even buy tickets to I Love the 90s, a musical revue that has been touring the world these past few years. Thankfully, Rolling Stone magazine was still around to explain it all in the article “Why 2018 Was A Year Of Nineties Obsessions.” That obsession is likely to reach another pinnacle next month, when the 25th anniversary of Kurt Cobain’s suicide will be marked.
No matter how kindly or harshly you judge the decade’s music, films, fads or fashions, this recycling is largely to be expected, given nostalgia’s tendency to follow something like a 20-year cycle. Though it also feels increasingly like we’re living through the decade’s revenge, the nineties put through a Black Mirror random story generator. Pundits may wax on about how unprecedented our current political times seem, but they’re not without their moments of woozy déjà vu – signal events and moments of the nineties recast and remade.
The recent U.S. government shutdown, over funding for Donald Trump’s theoretical border wall, backfired just as badly on the President as it did on Newt Gingrich, then the Republican House Speaker, when he pioneered the tactic in the mid-nineties in his budget battle with President Bill Clinton. Before that, in late September, Christine Blasey Ford’s testimony at the U.S. Senate judiciary committee hearings on Brett Kavanaugh’s confirmation to the Supreme Court, alleging a sexual assault by the nominee when they were both teenagers, recalled Anita Hill’s 1991 appearance before the same committee, then weighing the nomination of Clarence Thomas. Though it landed in the middle of the #MeToo moment, the result was the same: Ms. Ford’s testimony was discounted and Mr. Kavanaugh won confirmation. As the journalist Danielle Tcholakian put it, “Those who lived through both Hill and Blasey Ford’s testimonies … would be forgiven for feeling that time is a flat circle” – one leading back to the nineties.
Oct. 11, 1991: Law professor Anita Hill testifies before the Senate judiciary committee about her allegations of sexual harassment against Clarence Thomas, then a nominee for the Supreme Court. The Associated Press
So much of that decade’s politics – decisions made, issues unaddressed – haunts our current times more than that of any other decade. From societal issues of race and gender to the global economics of trade, from the radical transformations of the internet to the corrosive effects of growing political polarization, so many of the destabilizing forces that mark this current period in the United States and much of the West were either incubated, unleashed or amplified during that time. The list could go on, whether it’s our inability to address climate change as the urgent threat it is or the West’s troubled relationship with post-Soviet Russia. North Korea’s attention-seeking over its missile and nuclear programs is an ongoing saga almost as old as The Simpsons, and nearly as ridiculous for having gone on so long; it began in 1994, when Mr. Clinton secretly considered launching a pre-emptive strike on the Yongbyon reactor. (All that was missing from the latest Trump-Kim summit in Vietnam was some bunting announcing the 25th anniversary of this show.) The ultimate revenge of the nineties is probably that the man now occupying the highest office of the United States was, through much of that decade, its most notoriously failed businessman – bankrupt and abandoned by his lenders, his name reduced to a punch line on late-night talk shows.
And yet North Americans tend to take a rather rosy view of the nineties, with some pronouncing them “the last great decade.” That was the title of a three-part documentary TV series that aired on the National Geographic Channel in 2014. The following year, a New York Times commentary written by Kurt Andersen – an alumnus of Spy magazine, another fixture of the nineties zeitgeist – wore the headline, “The Best Decade Ever? The 1990s, Obviously.” Mr. Andersen went so far as to suggest that the decade “provoke[s] a unique species of recherche du temps perdu” that sets it apart from the usual cycles of nostalgia. “[L]ooking back at the final 10 years of the 20th century,” he wrote, “is grounds for genuine mourning: It was simply the happiest decade of our … lifetimes.”
Mr. Andersen seems to make a compelling argument on the nineties’ behalf. For much of the decade, annual economic growth in the United States averaged around 4 per cent, a number the country has only recently come close to matching. Unemployment, he wrote, shrank to new lows, median household incomes grew by 10 per cent, and stocks quadrupled in value. Here in Canada, the decade got off to a rougher start economically, with the country going even more deeply into recession than the United States in 1990-91, and then, mid-decade, enduring a period of severe restructuring that saw dramatic cuts to social spending, thanks to ballooning federal and provincial budget deficits. By the last years of the decade, however, Canada’s economy was in a similarly healthy position, in time to enjoy such other fruits of the nineties as Viagra and the internet, with all its democratizing, utopian promise. It was the decade, too, when we first became tethered to our mobile devices, with market penetration of cellphones nearing 40 per cent by 2000, and when the way we consume culture and interact with politics, after decades of staying more or less the same, began evolving toward the point we’re at now.
Mr. Andersen neglects to acknowledge how income inequality would spike during the decade, or that some combination of technological change and newly signed trade agreements would contribute to the hollowing out of the North American manufacturing base that working people depended upon. Near the end of his essay he allows that the decade was not without its problems – noting the failure to heed the growing danger presented by climate change, and how passage of the Financial Services Modernization Act of 1999, with bipartisan majorities in both houses of the U.S. Congress and the support of President Clinton, would help pave the way for the financial crash of 2008. Of these problems, however, Mr. Andersen blithely remarks, “But they weren’t obvious, so … we were blissfully ignorant!”
Which is entirely the point. In the introduction to his 2008 essay collection Reappraisals: Reflections on the Forgotten Twentieth Century, the historian Tony Judt argued that, in time, we would come to regard the period between the fall of communism (1989-1991) and the U.S. invasion of Iraq in 2003 as “the years the locusts ate: a decade and a half of wasted opportunity and political incompetence on both sides of the Atlantic.”
Quite suddenly, Mr. Judt argued, we fell for the notion that history had little to teach us, except in the most narrow, triumphalist sense. “With too much confidence and too little reflection we put the 20th century behind us and strode boldly into its successor swaddled in self-serving half-truths: the triumph of the West, the end of History, the unipolar American moment, the ineluctable march of globalization and the free market.”
It’s never very hard to find things to mock about what we once believed or enjoyed, even in the not-so-distant past. What’s uniquely striking about the nineties is just how deep our delusions went, how preposterously confident we were in our suppositions, jettisoning much of the history that the 20th century had taught us.
Rather, the eternal rightness of market economics, the nation-state’s declining relevance, the end of battles over ideology and the unipolar American moment – all were accepted as faits accomplis by much of the political and media class, as though they were the new permanent state of things.
Instead, two decades later, the liberal-democratic ideal appears more fragile and fractured than most of us imagined, even billionaires are questioning the viability of market capitalism and much of the 20th century’s baggage is washing back ashore. In the years since Mr. Judt’s death in 2010, his appraisal of the nineties has only come to seem more ruinously accurate. Everything that seemed bright and good about the decade may have just been a mirage.
The nineties in film and television: Wayne's World on Saturday Night Live; Disney's The Lion King; Michael Jordan and Bugs Bunny in Space Jam; Seinfeld, the sitcom about nothing; Sarah Connor's return in Terminator 2; and Jurassic Park. NBC/Everett Collection, Walt Disney Co., Columbia/TriStar, Universal Pictures, The Canadian Press
The nineties in popular music: Gord Downie of the Tragically Hip; Australian rock band Silverchair; rapper Puff Daddy, as he then called himself; Québécois chanteuse Celine Dion; rap legend Tupac Shakur; singer Mariah Carey at the 1996 Grammys; Will Smith, accepting an MTV Music Award for Gettin' Jiggy With It; and Prince, shown performing in 1997. AP, Reuters, The Globe and Mail, The Canadian Press
How to break politics, American-style
Our sunny recollections of the nineties in the West owe much to the impression that its epochal political changes and drastic redrawing of maps occurred with so relatively little bloodshed. Few could have predicted that the fall of the Soviet Union and its client regimes in the Eastern Bloc would be so anti-climactic, coming not after some decisive military confrontation, but instead imploding out of internal malaise. Few expected Germany’s reunification to happen as swiftly as it did. Nor could many have foreseen the non-violent end of apartheid in South Africa, or a democratically elected Nelson Mandela as its president.
The notion that the decade represented some interregnum of relative tranquility, however – sandwiched between the last gasp of Cold War proxy conflicts and the conflagrations yet to come in Afghanistan, Iraq and Sudan, not to mention the most endless and ill-defined of all wars, the one against terrorism – is false. The nineties, in fact, were plenty bloody. While some of the decade’s conflicts captured the West’s attention, such as the civil wars in Rwanda and the Balkans, and Russia’s brutal anti-insurgency campaign in Chechnya, millions more would die in the nineties than in the decade following 9/11, though largely in Africa and therefore off-stage from the West’s direct concern.
With fewer wars to fight abroad in the nineties, and its superpower status seemingly uncontested, America’s atavistic energies and antagonisms turned inward. If there was a war to follow it was the one playing out in Washington, between Mr. Clinton and Republican House speaker Newt Gingrich, who, with his populist-tinged “Contract with America,” led the GOP in 1994 to seize control of the House for the first time in 40 years. Mr. Clinton ultimately got the better of that rivalry. But in retrospect it was Mr. Gingrich who had the more lasting impact on the tenor and style of U.S. politics.
June 11, 1995: House Speaker Newt Gingrich and President Bill Clinton share a laugh at a Q&A session in Claremont, N.H. Jim Bourg/Reuters
Since his election to Congress in 1978, representing a suburban district in Georgia, Mr. Gingrich had made it his mission to rebrand the GOP in his own more militant image, with an uncompromising approach to politics as total war. Calling himself “the most serious, systematic revolutionary in modern times,” he attacked his party’s leadership for its civility and acquiescence toward Democrats, and in the cover letter to a memo from 1990, entitled Language: A Key Mechanism of Control, he endorsed a call for his Republican colleagues to brand their Democratic rivals as “traitors,” “pathetic,” “corrupt,” “radical” and, of course, “socialist.” While his tactics and rhetoric, with their disrespect for decorum and collegiality, often annoyed his senior GOP colleagues, they couldn’t argue with the fact that he was getting results. By the 1994 midterms, after two faltering years of the Clinton presidency, the Republican Party was all in with Newt. What followed wasn’t pretty – the Clinton-Gingrich battles defined by rancorous squabbling, partisan brinksmanship and the floating of conspiracy theories. Of course, Mr. Gingrich had an elevated sense of his motivations, once saying, “People like me are what stand between us and Auschwitz. I see evil around me every day.”
According to a recent study tracking trends in the partisanship of U.S. political speech as far back as 1873, “the Republican takeover of Congress led by Newt Gingrich” was the country’s inflection point toward the entrenched partisanship and polarization we see in America today. Calling the 1994 midterms “a watershed moment in political marketing,” it cites two contributing factors: the influence of consultants, such as the Republican pollster Frank Luntz, who helped Mr. Gingrich craft hot-button language that would resonate with voters; and changes in the media landscape that intensified the ambience of hyper-partisanship.
The decision by the Federal Communications Commission, in 1987, to stop enforcing the Fairness Doctrine – which required the holders of U.S. broadcast licences to cover issues of public interest in a manner that was “honest, equitable and balanced” – had unleashed a boom in conservative talk radio. By 1991, Rush Limbaugh was already the most widely syndicated radio host in the United States, and Mr. Clinton’s rise to the presidency provided the format with the perfect fodder to boost it to new heights. The founding of Fox News Channel in 1996 only added to the feedback loop of hotheaded fulminating. Then: the internet. It was the Drudge Report website that in January, 1998, first published an item saying that Newsweek was suppressing a story by one of its reporters about the President’s relationship with a White House intern. And lo, the right-wing blogosphere was born.
March 3, 1991: A video shot by George Holliday shows police officers beating a black man, later identified as Rodney King. The assault would trigger destructive riots in Los Angeles and protests against racial injustice. George Holliday/KTLA Los Angeles/The Associated Press
Meanwhile, America in the nineties was fragmenting on other fronts outside of Washington, in ways that echo uncannily still. The brutal, videotaped beating of African-American motorist Rodney King at the hands of four white Los Angeles police officers, and the riots that ensued following the officers’ acquittal in April, 1992, set off a national dialogue about the unequal and often violent treatment of minority groups by U.S. law enforcement. Time-machine ahead to the riots’ 25th anniversary in 2017, and the country would find itself in the midst of another reckoning over the police killings of African Americans.
Further out on the fringes, the new polarization would explode into violence at an even more frightening scale, with the April, 1995, bombing of the Alfred P. Murrah Federal Building in Oklahoma City, Okla., by Timothy McVeigh. A Gulf War vet turned anti-government extremist, Mr. McVeigh had connections to the then-emerging Patriot movement, a mostly informal collection of right-wing militias, hard-core survivalists and white identitarians that would be among the forerunners of today’s “alt-right.”
In the wake of the Oklahoma City bombing, U.S. law enforcement upped its tracking and infiltration of right-wing extremist groups – that is, until 9/11 came along and the entire national security apparatus became exclusively focused on the threat of foreign-sourced jihadi terrorism. As a New York Times investigation revealed last fall, by the time of the alt-right rally in Charlottesville, Va., at which one counterprotester was killed, there was a dearth of useful intelligence on the movement.
It’s illustrative of just how narrowly the al-Qaeda attack of Sept. 11, 2001, and the subsequent wars in Afghanistan and Iraq would come to focus America’s attention, to the detriment of growing systemic fissures – until, with first the 2008 economic crisis and then the recent populist backlash, those cracks opened underneath us. If sometimes it feels as though the world we’re now living in is like some Freudian return of the repressed, what’s returned is the unfinished business of the nineties.
The nineties in Canadian politics and social change: The Oka crisis, the evolution of Preston Manning's Reform Party into the Canadian Alliance, Elijah Harper's stand against the Meech Lake Accord, the prime ministerships of Kim Campbell and Jean Chrétien, the national-unity rally before the 1995 referendum on Quebec sovereignty, the beginning of Adrienne Clarkson's tenure as governor-general, the creation of Nunavut, the Ralph Klein era in Alberta. The Globe and Mail, The Canadian Press, Reuters, AFP
Nineties as prologue
Here in Canada, the polarization may not seem quite so shrill or severe, though it’s also, possibly, just delayed. The patterns since the 1990s have not been dissimilar. Just as Mr. Gingrich’s ascent to the Republican leadership signalled a more rightward, socially conservative and populist shift in U.S. politics, so did the splintering of the federal Progressive Conservatives here into the Reform Party, which stood candidates nationally for the first time in 1993, while Stephen Harper’s later time as Prime Minister effectively purged any remnants of Red Toryism from the conservative movement.
At least Canadians can attribute some of the coarsening of tone and erosion of civility here to a shared cultural space and media spillover across the border. How did polarization also come to afflict so many countries across Europe, as well as democracies as far apart as the Philippines and Brazil, mostly paving the way for right-leaning autocrats and parties to win or consolidate power?
The strain of reactionary populism we see from Mr. Trump, Brazil’s Jair Bolsonaro, Viktor Orbán in Hungary and Poland’s Law and Justice Party would seem to thrive in today’s volatile environment of diminished social cohesion and rising economic uncertainty, shouting simple answers to complex problems. Memories of the 2008 financial crisis – and the perception that elites, despite causing the crash, ultimately prospered from it, while the rest of us struggled through austerity or, at best, a sluggish recovery – provide today’s populists with ample grievances to exploit, which many adulterate with a host of other resentments, whether over immigration, cultural identity or the pace of technological and demographic change.
And here again the nineties are instructive, both in cause and effect. The neoliberal economic policies of trade liberalization, financial deregulation and cuts to social spending, first adopted by conservative governments in the United States and Britain in the eighties, would in the nineties, through institutions such as the World Trade Organization and the World Bank, establish much of the framework for post-Cold War globalization.
Conservative commentator Pat Buchanan, shown in 1992. Dennis Cook/The Associated Press
In the United States, the dislocations wrought by the decline of the industrial base, along with growing income disparities, would gradually provide fertile ground on the right for opposition to globalization and an ambient suspicion of international entanglements, military and otherwise. Pat Buchanan, a former adviser to presidents Richard Nixon and Ronald Reagan and one of the original cable-TV pundits, anticipated this shift with his insurgent candidacies for the GOP presidential nomination in 1992 and 1996, creating a space on the conservative right for a candidate more militant on immigration, abortion and other social issues, but one who also – against Republican orthodoxy – opposed trade deals such as NAFTA, and globalization generally, which he blamed for stealing American jobs.
Mr. Buchanan might be considered the spiritual godfather of today’s Trumpists, Brexiteers and anti-globalists. But he and his heirs were helped along by the so-called triangulations of Mr. Clinton and Tony Blair, who largely abandoned their parties’ traditional support of labour and implemented policies that embraced the neoliberal trend rather than challenging or redressing it. Fast-forward to 2016, and the Trump campaign’s last television ad before election day, in which the candidate accused a “global power structure” of having “robbed our working class, stripped our country of its wealth, and put that money into the pockets of a handful of large corporations and political entities.” It’s the kind of thing you might have heard the anti-globalization protesters predicting at the 1999 WTO meetings in Seattle.
The same year as those Seattle protests, Mr. Trump briefly mooted a run against Mr. Buchanan for the Reform Party nomination in the upcoming presidential election. Mr. Trump agreed with Mr. Buchanan that trade deals were a rip-off for American workers and that elites were to blame. But he was appalled, he said, by the nativist tone of Mr. Buchanan’s rhetoric, calling him a “Hitler lover,” and by the fact that white nationalists such as former Ku Klux Klan grand wizard David Duke were flocking to the party. Mr. Trump bowed out in the end, while Mr. Buchanan would go on to take only 0.4 per cent of the popular vote in the presidential election. Fifteen years later, under the influence of Steve Bannon, you could have accused Mr. Trump of plagiarism for how closely his winning campaign resembled Mr. Buchanan’s in 2000. (So much so that David Duke voted for him.) But all that seemed more like sideshow than prologue at the time.
The nineties in global protest: 1999's Battle in Seattle at the World Trade Organization summit; 1993 sit-ins demonstrating against logging in B.C.'s Clayoquot Sound; a 1998 protest after the Vancouver APEC summit, with masked demonstrators mocking Indonesian president Suharto and Canada's then prime minister Jean Chrétien (shown dousing a demonstrator with pepper, a reference to a joke he made about pepper-spraying). AP, Reuters
The forgotten opposition
Despite being derided and ridiculed by much of the media and political class at the time, the anti-globalization protesters in Seattle would turn out to be right about many things. Principally, they were right to fear job losses and downward pressure on workers’ incomes while capital, able to circulate more freely across borders, enjoyed increasingly higher returns, exacerbating the trend toward greater income inequality. The environmental contingent warned that WTO policies would not only increase pollution and emissions in the developing world, but pressure the West to lower standards as well – which did not exactly happen, at least not until Mr. Trump took office, and likewise Doug Ford in Ontario.
To be fair, the protesters did not foresee how trade would lift millions out of poverty outside the West, including in China, which as a result of decisions made in Seattle would be granted accession to the WTO in 2001, putting it on the path to becoming the world’s largest economy. But if you were to hand out scorecards, the protesters’ instincts proved more accurate than those of much of our political and media class, who assured us that the world would become a less volatile, more peaceable place, as economic integration replaced rivalries both ancient and modern with mutual self-interest, and countries worked multilaterally toward addressing chronic problems of poverty and insecurity – a fanciful thought in retrospect. We still have a form of globalization, just not the one then imagined, with the West in a position to preach policy to the rest. Rather, we have something more fragmentary, multipolar and episodic, based not on universal principles or high-minded values, but on mutually reinforcing political or economic interests and shared antipathies.
The reunification of Hong Kong with the mainland in 1997 had many respected China watchers predicting that the openness and dynamism of the former British colony would eventually help loosen the reins in Beijing. A New York Times columnist called Hong Kong a “colossal Trojan horse” that could destabilize the regime. And Hong Kong or not, it was argued, increased wealth and living standards among the Chinese would in any case lead to more demands for democracy and personal freedom. The argument’s flip-side was that a lack of liberalization and openness would put a ceiling on China’s development and aspirations to global leadership. It’s only now, under the autocratic control of Xi Jinping and during one of the most oppressive periods the country has seen since Mao Zedong, that those same observers have recanted their earlier forecasts. As an Economist feature from last year put it, “The West has lost its bet on China, just when its own democracies are suffering a crisis of confidence.”
The nineties in digital culture: Sonic the Hedgehog on the Sega Genesis; an event at Toronto's CN Tower marking the release of Windows 95; Sandra Bullock in the tech-based thriller The Net; the late Steve Jobs, holding an iMac computer in 1998; finance minister Paul Martin in 1995, brushing up on his computer skills for an upcoming federal budget, the first to be made available online; and Netscape Navigator, once the world's most popular internet browser. Reuters, The Canadian Press, AP
Lost decade
One of the quainter predictions made in the nineties, no doubt influenced by the post-Cold War mood, when the West thought it had triumphed over authoritarianism, was that the internet would inevitably lead to more political freedom around the world. It would be all but impossible for a repressive regime to censor or surveil its people, so the argument went, when they could communicate and co-ordinate peer-to-peer, circumventing traditional or state-controlled media. For a good while that appeared to be the trend of things, from the early online mobilizations of primarily left movements to the infant years of social media (Facebook b. 2004, Twitter b. 2006) – the internet and mobile technology empowering dissident and rights-based groups, giving them platforms on which to co-ordinate actions and build global alliances. Autocratic leaders would be held more closely to account, even toppled, as when the 2011 Tahrir Square protests brought down Egyptian dictator Hosni Mubarak.
Instead, as Zeynep Tufekci wrote last year in an article for the MIT Technology Review, “digital technologies have gone from being hailed as tools of freedom and change to being blamed for upheavals in Western democracies – for enabling increased polarization, rising authoritarianism, and meddling in national elections by Russia and others.” Even if the internet strengthens the ability of social movements to organize and co-ordinate, it has likewise amplified the power of other actors, governmental and not, foreign or domestic, to target opponents for abuse, spread misinformation or stage distractions, thereby weakening social trust to the point “that everyone is too fractured and paralyzed to act.” Or, in the case of China and increasingly Russia, they are simply sealing off their internet from the rest of the world behind a firewall. (A draft bill currently under debate in the Russian parliament would require all internet traffic to move through servers physically based in the country, and would increase funding for the office responsible for monitoring it.) As for Facebook and Twitter, which were not long ago celebrating their roles in facilitating the toppling of dictators? Seems they were meanwhile “solidifying their technical chops for deeply surveilling their own users,” Ms. Tufekci writes, and then selling the data.
It’s hardly the wired world imagined by John Perry Barlow in 1996, when he wrote “A Declaration of the Independence of Cyberspace,” arguably the internet’s foundational ideological document, especially among the budding disruptors of Silicon Valley. Extolling his vision of a self-regulating internet, free from government intrusion or control – while omitting, conveniently, that it was largely built with government money – Mr. Barlow’s manifesto was soon republished on some 40,000 websites. What’s most perplexing about rereading the document today, aside from how awash it is in premillennial techno-utopian hokum, is its exclusive focus on the threat of government rather than, say, private corporations, such as the ones that would come to dominate the internet. It is the latter, at least in the West, that have proved an equal, if not greater, threat to citizens’ privacy and online security.
John Perry Barlow, shown in 2002, authored "A Declaration of the Independence of Cyberspace." Jack Dempsey
One could forgive Mr. Barlow’s omission, and, like Kurt Andersen, say, hey, it was the nineties, how should we have known? Except that the manifesto’s profound influence on Silicon Valley’s image of itself gave justification to tech platforms’ long-standing resistance to regulation of any sort. Attempts at government intervention or oversight, they argued, would “stifle innovation,” as though it were the worst of all secular sins. Only now, after much damage done, have governments in the West begun to contend with the myriad ways that the internet has distorted or imperilled our political discourse.
Later in his introduction to Reappraisals, Mr. Judt anticipated how the architecture of globalization emerging in the nineties, along with technological disruption and the erosion of the welfare state in the West, would give rise to today’s wave of populist autocrats, writing that its most avid proselytizers “may be in for a surprise, as populations in search of economic and political security turn back to the political symbols, legal resources, and physical barriers that only a territorial state can provide.” He foresaw, too, how “Fear is re-emerging as an active ingredient of political life in Western democracies,” and warned that the combustible politics of insecurity and polarization created the conditions for a turn to authoritarianism.
Rather than marking the end of ideology, as famously suggested by Francis Fukuyama’s 1992 book The End of History and the Last Man, the nineties were afflicted by an ideological tunnel vision – a growing consensus in the West around a set of ideas about economic integration, governance, technological progress and the deregulation of finance that betrayed a rather blinkered or selective understanding of history. As a historian, Mr. Judt seemed most bothered by “the idea that we live in a time without precedent: that what is happening to us is new and irreversible and that the past has nothing to teach us.” But something always happens next; history never trends only one way forever, because just like the individuals who people it, each historical moment comes freighted with its own blind spots, delusions and internal contradictions. (To wit: Mr. Trump wins power by attacking elites, then grants the wealthy the largest tax break in U.S. history.) If history is anything, it’s ironic, sometimes happily but more often painfully, with some new correction just around the corner.
Which should give us some cause for optimism about the current moment. While one must be clear-eyed about the challenges we face – indeed, multilateral co-operation would seem to be collapsing just when it’s most needed, especially on the climate-change front – we hardly need resign ourselves to the apocalypse. The repudiation of Trumpism in the U.S. midterm elections in November surely augurs that something is afoot. But nothing is inevitable. Autocratic populists may yet gain more strength, perhaps in some countries and not others. Though it’s also possible they merely represent the noisy last gasp of a receding world view, a final pushback against even larger gathering trends.
As for the nineties and what to make of them in the present ambience of nostalgia? These days the expression “lost decade” is most often used to describe Japan’s experience of those years, the onset of a prolonged period of stagnation just as some were predicting the country would soon eclipse the U.S. as the world’s largest economy. But the West would prove no less lost, in both its delusions and opportunities missed.
Clockwise from top right: Luke Perry, Shannen Doherty, Jason Priestley, Gabrielle Carteris, Tori Spelling, Brian Austin Green, Jennie Garth and Ian Ziering in Beverly Hills, 90210. Andrew Semel