The Big Picture
How Buffy the Vampire Slayer Turned Me Into a TV Critic
What happens when your side wins the fight, the drunken cultural brawl that you’ve been caught up in for nearly two decades? And then the rules change, midway through? That’s the crisis that I’m currently facing, when it comes to the beauty and power—and lately, even the definition—of television as an art form.
When I first began watching television, there didn’t seem to be much to argue about. Like many children of the seventies, I grew up sitting cross-legged in front of a big console in the living room, singing along to The Electric Company while my mom made Kraft Macaroni & Cheese. I dug Taxi, I loved M*A*S*H. In my teens, I memorized Monty Python sketches with my friend Maria. But I also regarded TV the way that Americans had been taught to, since the 1950s. Television was junk. It wasn’t worthy of deep thought, the way that books or movies might be. It was something that you enjoyed, then forgot about. It wasn’t until my thirties that I had what amounted to a soul-shaking conversion, on the night that I watched Sunnydale High School principal Bob Flutie die, torn to bits by hyenas.
At the time, in the spring of 1997, I was a literature doctoral student at NYU, foggily planning on becoming a professor, maybe a Victorianist, but anyway, somebody who read for a living. Every morning, I woke up, flopped onto the sofa, and opened up yet another 900-pager. Across the room was an old-fashioned console TV, a dinosaur even for the era, with a broken remote control, so in order to watch my first episode of Buffy the Vampire Slayer, I had to physically walk across the room, then click the circular dial over to Channel 11, The WB, a brand-new “netlet,” and then walk all the way back to the sofa.
Walking across the room to change the channel was still a normal thing to do, in 1997. It had been nearly sixty years since the first television (spookily nicknamed the Phantom Teleceiver) launched at the 1939 World’s Fair, and yet the medium was—with a few advances, like the addition of color and the still-tentative expansion of cable—not that different from what it had been in the 1950s, when families gathered to watch Milton Berle. Shows aired once a week. They were broken up by ads. When the ads were on, you peed. When they ended, someone in the other room would yell, “You’re missing it!” and you’d run back in. If you loved a particular show, you had to consult the elaborate grids in the print newspaper or in TV Guide to know when to watch: “ALF (CC)—Comedy. ALF is upstaged by a loveable dog that followed Brian home, so he gives the pooch away to a crotchety woman (Anne Ramsey).”
The main thing, though, was that television went away. It was a disposable product, like a Dixie Cup. Although scripted television hadn’t aired live for many decades, it still felt live. You could watch rental movies on your VCR (and for a few years, they were everywhere), but most people I was friendly with didn’t regularly pre-program theirs to record much TV, because doing so was such a pain: spinning three plastic dials, for the day, the hour, and the minute. Each videotape held only a few hours of programming; rewinding and fast-forwarding were clumsy processes (and pausing might break the tape). There were no DVDs yet, let alone DVRs. Even if you were an early Internet adopter, which I was, dialing in was a grindingly slow, unreliable process—and when you did connect, with the hostile shriek of static that we optimistically called a “handshake,” no videos showed up, just a wall of blinking neon fonts. Nothing, ever, arrived “on demand.”
This glitchy, ephemeral quality, and the ads that broke up the episodes, were a major part of TV’s crappy reputation. This part may be hard to remember, even if you lived through it. But just before the turn of the century—nearly universally, by default, and with an intensity that’s tough to summon up now—television was viewed as a shameful activity, as “chewing gum for the eyes,” to quote drama critic John Mason Brown. This was true not only of snobs who boasted that they “didn’t even own a TV”; it was true of people who liked TV. It was true of the people who made it, too. TV was entertainment, not art. It was furniture (literally—it sat in your living room) that helped you kill time (it was how to numb lonely hours while eating a “TV dinner,” shorthand for a pathetic existence). TV might be a gold mine, economically speaking, but that only made it more corrupt. If you were an artist, writing TV was selling out; if you were an intellectual, watching it was a sordid pleasure, like chain-smoking. People still referred to television, with no irony, as “the boob tube” and “the idiot box.” (Some people still do.)
This is not to say there were no good shows. Critics praised (and, often, overpraised) the grit of Hill Street Blues, the nihilistic wit of Seinfeld, yadda yadda yadda. In the mid-’90s, there were several major breakthroughs in the medium, among them the teen drama My So-Called Life and the sci-fi series The X-Files. But among serious people, even the best television wasn’t considered worthy of real analysis. This was particularly true among my grad-school peers, the thinky guys whom I had privately nicknamed “the sweater-vests”—the men who were also, not coincidentally, the ones whose opinions tended to dominate mainstream media conversation. For them, books were sacrosanct. Movies were respected. Television was a sketchy additive that corporations had tipped into the cultural tap water, a sort of spiritual backbone-weakener.
The scripture for this set of thinkers was an essay by the writer George W. S. Trow, “Within the Context of No-Context,” which people recommended to me so frequently that it started to feel like a prank. A masterwork of contempt, “Within the Context” was a trippy string of koans that was initially published in 1980 in The New Yorker. It came out in book form in 1981, then got rereleased in paperback in 1997, the same year that Buffy the Vampire Slayer debuted. As Trow saw it, television was a purely sinister force. It was a mass medium whose mass-ness was its danger, because it conflated ratings with quality, “big” with “good.” The vaster television got, the more it ate away at the decent values of mid-century America—back when viewers were people, not demographics; adults, not children; capable of intimacy and proportion. “Television does not vary,” he wrote. “The trivial is raised up to power. The powerful is lowered toward the trivial.” Also: “What is loved is a hit. What is a hit is loved.” It was an elitist screed, nostalgic for an America that had never really existed, but it had a penetrating, pungent force.
On Charlie Rose in 1996, novelists David Foster Wallace, Mark Leyner, and Jonathan Franzen struck an only slightly less apocalyptic note, spending nearly half of what was advertised as a panel on “The Future of American Fiction” denouncing television as “a commercial art that’s a lot of fun that requires very little of the recipient.” Their worries about television’s “kinetic bursts” were a precursor to the pseudoscientific rhetoric (“dopamine squirts”) that would later greet the Internet, once that stepped in as a cultural bogeyman. But then, they were part of a long tradition. In 1958, newscaster Edward R. Murrow had warned about the propagandistic dangers of the medium in his brilliant “box of lights and wires” speech. In the 1970s, popular jeremiads like Jerry Mander’s Four Arguments for the Elimination of Television and Marie Winn’s The Plug-In Drug diagnosed TV as an addiction. In the 1980s, the slogan “Kill Your Television” was a hip bumper sticker. At the turn of the century, watching TV was still widely seen, in the much-quoted (although possibly apocryphal) words of nineties comic Bill Hicks, as a spiritually harmful act, like “taking black spray paint to your third eye.”
There were occasional exceptions to this mood, among them Chip McGrath’s 1995 cover story in The New York Times Magazine, “The Triumph of the Prime-Time Novel,” in which he praised ER and Homicide: Life on the Street for their “classic American realism, the realism of Dreiser and Hopper.” But as his title indicated, McGrath’s argument was just the flip side of the one made by the Charlie Rose panel. TV might, in fact, be worth watching—but only when it stopped being TV.