On May 20, 1995, a month after domestic terrorists bombed the Alfred P. Murrah Federal Building in Oklahoma City and killed 168 people, authorities placed temporary concrete roadblocks on the stretch of Pennsylvania Avenue north of the White House. The barriers have only become more permanent since then. There may be more potent symbols of America’s lost innocence: JFK’s assassination, the Challenger explosion, 9/11. But as moments of paranoia go, the Murrah Building bombing is exceptional, and W. Joseph Campbell rightfully pegs Timothy McVeigh’s act as transformative: It “signaled the rise of a more guarded, more suspicious, more security-inclined America, and of what can be called ‘a national psychology of fear,’” he writes in his new book 1995: The Year the Future Began.
Don’t let the TED Talk-esque subtitle trouble you. Campbell, a professor at American University, isn’t prone to making grandiose claims about the importance of 1995. But it was a year of interesting pivots in our national culture, and his book is built on brisk, nonacademic surveys of five of them: the McVeigh bombing, the rise of the Internet, the O.J. Simpson trial, the Dayton Accords ending the ethnic war in the Balkans, and Bill Clinton’s affair with Monica Lewinsky. On the face of it, some of these cases don’t seem sufficient to make the year “a clear starting point for contemporary life,” as Campbell writes. But what’s consequential about them now isn’t what people imagined it would be then.
For instance, Campbell argues that the O.J. trial matters now for legitimizing DNA evidence on a national stage, more than the semi-serious discussion about race that pundits predicted at the time. And it’s practically axiomatic that the online world of 1995 has had a long reach. Household Internet adoption began its speedy surge; the launch of Windows 95 gave computers a then-unheard-of glitz; Amazon.com hung out a shingle; the browser wars gave dotcom startups a celebrity that’d become serially hubristic; and anti-web contrarianism arguably found its first flower in the form of scold Clifford Stoll. (“Why not send a fax?” he wrote.)
Campbell is less persuasive in other cases, though. The Dayton Accords, he writes, “launched the United States on a trajectory of increasingly forceful interventions abroad.” Even allowing that in 1995 Clinton acquired a backbone on the international stage that he’d lacked in Somalia and Rwanda, the USS Cole bombing and 9/11 did more on the force-launching front. (Even the Murrah Building attack may have mattered more, stoking anti-Middle East sentiment that dissipated only slightly after it emerged that the terrorists were homegrown.) As for his assertion that the Clinton-Lewinsky revelations “signaled a stark partisan divide that would deepen and intensify during the first fifteen years of the twenty-first century,” the likelier wedge is the government shutdown that enabled the pair’s assignations—or, apter still, Newt Gingrich’s speakerdom, which prompted the shutdown. Clinton’s bubba eruption got him impeached, sure. But who today makes anything of it besides compulsive enthusiasts of the #TCOT ilk?
However debatable Campbell’s points are, his book voices a larger, legitimate point: The forces that reshape history are rarely the ones you expect. Who knew that a Seattle book hawker named Jeff Bezos would reshape the online economy? Who would’ve figured that the O.J. trial would set the world up for CSI: Miami? Who could’ve predicted a stable Balkans, and an unstable Oklahoma? Or imagined a world where a fax is a punchline?