
R.I.P., Kill Your TV

Thinking outside the idiot box

When HBO announced it was making a movie of Fahrenheit 451, it was a bit of a headscratcher. Ray Bradbury’s novel depicts a society gone numb on endless loops of home entertainment. Why would HBO, the nation’s great pioneer and name-brand purveyor of endless loops of home entertainment, produce a devastating critique of itself—that is, of television, a medium that Bradbury went to his grave calling an “insidious beast, that Medusa which freezes a billion people to stone every night”?

Only one answer was imaginable: HBO would do no such thing. It would instead gut Fahrenheit of its core idea, Kill Your TV, and remix the dystopia as an extended, slightly edgy ad for Barnes & Noble Classics. The network would strip out the story’s ubiquitous television screens in favor of non-screen, non-entertainment gadgets like smart speakers. A remake along these lines would be roughly akin to a GlaxoSmithKline production of Aldous Huxley’s Brave New World that used non-pharmaceutical stand-ins for the futurist all-purpose narcotic that Huxley called soma. In both cases, the deviations would serve to soften or remove any unsettling echoes between the original plots and their latter-day cognates—mood-enhancing drugs in Huxley’s case, and “peak TV” in Bradbury’s.

HBO did indeed deliver a Fahrenheit devoid of television—a Jaws without the shark. In the script written and directed by Ramin Bahrani, Bradbury’s “insidious beast” barely makes an appearance, much less turns anyone to stone. Cut from the story is a central character, Guy Montag’s TV-addicted wife, Mildred, who spends her waking hours engrossed in interactive serial dramas displayed on three giant “parlor” screens. In the original 1953 novel, Montag is distraught but resigned to the fact that “No matter when [he] came in, the walls were always talking to Mildred.” It’s during one of his wife’s regular viewing parties that Guy finally snaps, setting him on course to torch his boss with a flamethrower and flee the city’s boundless warren of living-room home box offices.

François Truffaut faithfully reproduced this pivotal scene in his 1966 adaptation of Fahrenheit, but the HBO version finds Montag curiously unmarried. The holographic parlor screens in his tranquil space-age bachelor pad appear only briefly; they display no loud sitcoms or dramas, only a series of stills from a search on the history of firemen. In the monitor built into the bathroom mirror, a futuristic social media feed displaces the dramatic fare spit out by the televisions of the original story. Other depictions of indoor screens in the HBO remake are few; the most notable is a vintage Zenith-style turn-dial set used by a resistance cell to screen video manifestos.

None of the critics who mostly panned the remake found it odd that HBO’s Fahrenheit presents a TV set as a symbol of the Resistance. Then again, since they most likely regard 30 Rockefeller Plaza as a strategic bunker in their own Trump-age version of the Resistance, why would they?

Boxed In

A decade into television’s remarkable cultural rehabilitation, the Kill Your TV politics of Fahrenheit 451 is widely judged a fuddy-duddy non sequitur. The new consensus extends even to the left-leaning tradition of culture criticism, traditionally the incubator and trustee of the honorable project of studying and opposing television as a powerful and insidious instrument of ideological control and mental stupefaction. A generation raised on sponsored MSNBC and Daily Show clips, Netflix and HBO Go shows slyly saturated with product placement (now called “integration”), and social media feeds thick with video ads has convinced itself that “watching TV” is something their parents do, not them. They construct their viewing calendars à la carte, using different delivery systems, you see. This has somehow magically changed television content into something that is not television content. They don’t veg, which sounds passive and bad. They binge.

By escaping its box and slipping into every crevice of our lives, television has performed the devil’s trick of making us believe it doesn’t exist. As its physical body thins and melts away, it becomes less enticing as an object begging to be smashed, should anyone still want to smash it. Can there be a boob tube without a tube? An idiot box without a three-dimensional box?

And if television no longer exists, neither do the criticisms long leveled against it. They become cultural relics, like rotary phones, fit only for basement storage in boxes full of Jello Biafra spoken-word cassettes, or maybe usable as punchlines in YouTube clips of Generation Z brats staring blankly at artifacts of the pre-Facebook world. We have cut our cords, to the walls and to the past alike.

The critics’ reactions to the HBO Fahrenheit’s deletion of television served as a clarifying gauge of this shift. Mostly, the response was crickets. The few journalists who noticed at all praised the choice in the interest of contemporaneity. As New York Times critic James Poniewozik put it, Bradbury’s focus on mass media and television has been rendered an “outdated . . . specifically 1950s” concern, one that’s been “flipped” by the rise of “niche and social media.”


Has it, though? According to Nielsen, the biggest spike in TV viewing history took place during the first decade of the 2000s; by its end, the average household had one or more sets running nearly nine hours a day. The number has since settled at around eight hours. Set aside the small, nimble screens featuring “niche and social media,” which in fact mostly repurpose television content, and you’re still left with a volume of daily TV viewing equivalent to the average workday. Indeed, eight hours is nearly double the amount of TV Americans consumed when Fahrenheit appeared on new-release tables alongside Junky and Go Tell It on the Mountain. Think what you want about Bradbury’s warnings of an antennaed Medusa, but these are not numbers you associate with an “outdated, specifically 1950s concern.”

Just over a week after HBO premiered Fahrenheit 451, The Atlantic published a straight-faced article titled “18 New Shows to Watch This Summer.” You could have found similar fare in two dozen other magazines and websites, whose TV writers, working in one of journalism’s few growth fields of the last decade, struggle valiantly to keep pace with the sustained revelation of TV’s second “Golden Age.” For our new cohort of critical gatekeepers, this blessed epoch’s riches are so embarrassingly deep they render the warnings of last century’s gray-beard social critics and sci-fi authors obsolete. Anyone who denies that this is a wondrous time to be alive and watching television is not just an asshole, but an asshole who is morbidly password- and satellite-dish deprived. Under this consensus, any artillery aimed at screens is reserved for the smaller ones associated with social media and the internet—though not because they have become extensions of television, supercharging its mutation into a shape-shifting T-1000, making it harder to escape, to smash, and to kill than ever before.

Here’s the funny thing about the HBO Fahrenheit’s nervous elision of television under the guise of renovation: in Bradbury’s version, the TVs didn’t have tubes, boxes, or cords, either. The TVs in the novel are connected to each other via an internet-style network called a “circuit.” They are, in other words, a prescient imagining of what marketing researchers today call social television. An early caricature of today’s media-driven acolytes of the “fear of missing out” cult, Mildred Montag desperately consumes the same popular shows as everyone else, mainlining the temporary illusions of community and belonging they provide, pausing only to pop pills and wonder who else is watching. [1] Mildred’s plan to buy a fourth screen for the living room’s remaining empty wall even anticipates the ad industry term—“the fourth screen”—for the smart phones and tablets that cover every interior inch of our waking lives. Jules Verne flubbed his prediction of television’s arrival and omnipresence by some eight hundred years; Bradbury correctly foresaw television morphing and multiplying two generations out from I Love Lucy. His mistake was assuming the screens would only get bigger, not bigger and smaller at the same time.

Golden Globs

With every crank of the media history wheel, clamor over one medium disappears into the uproar set off by the next. Fierce debates raged around novels, newspapers, and comic books. Radio was once a source of great fear and suspicion, blamed for dumbing down culture, threatening democracy and producing Hitler—until television came along. Now the internet has moved our focus down the line, retroactively bathing television in a flickering amber glow. TV is our friend from a simpler time, when Walter Cronkite guarded the news and everyone watched the same Thursday Night Movie. Those were the days.

This shared nostalgia is a false memory. The debate over TV roared on into the new millennium, surviving both The Sopranos and the rise of prime-time liberal stars on MSNBC and Comedy Central. It was only around the time of Obama’s reelection, better known as the final season of Breaking Bad, that the TV debate became unrecognizable, transformed into dueling episode recaps and debates over which shows deserved temple status in an age when television is praised in language once reserved for Periclean Athens.

Television’s opponents had always hoped the medium would get worse—so bad, in fact, that a critical mass of people would reject it and then, just maybe, the system it propped up and propagated so effectively. But it didn’t get worse; in the new century, it’s true, some of the entertainment got better. The best of the new shows, beginning with The Wire, did indeed contain “novelistic detail.” Comcast and Viacom sponsored nightly left-ish satire and commentary. Anyone who ever waited hours in line at the local art house for Spike and Mike’s Sick and Twisted Festival of Animation could only marvel at the existence of Adult Swim. But the deep case against television was never about the content. In the sunrise of the new “golden age,” like a siren call from afar, the critiques of television developed over the years could still be heard: critiques of the design, impacts, and interrelationship of the medium’s form, content, and political economy.

These critiques had been gaining audience and urgency for decades. They served an important function. Then a fat and brooding mafioso from New Jersey sat on them, and the left forgot where it came from, at the flick of a switch.

Hot and Unbothered

Compared to the moral panics that historically greet new media, television enjoyed a smooth reception. It made a few high-profile cameos in anxious tracts of postwar social science and philosophy, but was largely ignored for decades. As trenchant an observer as C. Wright Mills all but dismissed TV, writing in 1953, “All of the ugly clamor of the radio, which has now been visualized on television, has become so much a part of the texture of our daily life that we do not truly experience it any more.” Dwight Macdonald, perhaps the critic best suited by temperament to fit the crypto-reactionary caricature of the scolding anti-TV elitist, was a movie critic for the Today show. Paul Goodman took the technology most seriously, seeing the box as the shape-appropriate symbolic cornerstone of a conformist consumer society. The world of television, wrote Goodman in 1963, is “a peculiarly pure product of our public policy of an expanding economy with artificial demand (plus annual increases in the arms budget), to maintain both high profits and adequate employment. It has the aesthetics and human values that fit that policy.”

The Beats intuited that television was somehow central to the functioning of the American Moloch—“America this is the impression I get from looking at the television set / America is this correct?” asked Allen Ginsberg in 1956—but the subculture’s position on TV was mostly one of recumbence. It was Hollywood that provided some of the first pointed attacks on its upstart rival for America’s attention. The Man in the Gray Flannel Suit depicts the living room TV as a self-aware episode of The Twilight Zone might have done—as a hypnotizing, violent, undifferentiated disturbance. “Kick that television in,” says Fredric March. “Kick it in and stomp it if it gets in the way of the family.”

The 1960s appeared to offer ripe conditions for the rise of a mature anti-TV politics. Strangely, television got something of a pass during the decade’s youth-led great refusal. Despite the supposedly formative influence of Herbert Marcuse, who described TV as a prime enemy of oppositional thought in his 1964 tract of Marxist negation, One-Dimensional Man, the generation’s activists, weaned on TV, saw the medium as quasi-heroic. It brought down Joe McCarthy, lifted the civil rights movement into national consciousness, and seemed an inadvertent ally in turning the country against the Vietnam War. They also didn’t mind the attention of cameras. Among the indelible chants of a generation known for its narcissism was “The Whole World Is Watching.”

Global Village Idiot

Then there was the retarding influence of Marshall McLuhan, the Canadian celebrity academic who served as a hype-man for television throughout the decade. McLuhan famously believed sitting in front of television made people active participants in a shared drama, one told over the giant campfire of a global electronic village. He didn’t care much about this drama’s content or its sponsors, or that everyone in the audience was sitting alone in a dark room staring at a box, subjected to a stream of manipulative ads. McLuhan thought television’s visual immediacy, its “coolness,” made it inherently revolutionary.

Though McLuhan denied ever doing LSD—he called it the lazy man’s Finnegans Wake—he wrote like an acidhead about cosmic consciousness and planetary vibrations. This trippy packaging of imputed televisual virtue resonated, as it was clearly designed to, with the counterculture. Indeed, many veteran hippies followed McLuhan into the 1970s, when he transposed his rhetoric about “universal understanding and unity” from TV onto computers, providing a readymade script for internet utopians on the make like McLuhan’s disciple Stewart Brand.

McLuhan could only delay the inevitable reckoning with TV. On the eve of Richard Nixon’s election, the novelist and screenwriter Harlan Ellison launched a column for the Los Angeles Free Press, the country’s underground newspaper of record, called “The Glass Teat.” In each column, which ran until 1972, Ellison performed vivisections on prime-time programming to demonstrate the many ways television was no innocent technology, but a weapon in service of the System. “They’ve taken the most incredibly potent medium of imparting information the world has ever known, and they’ve turned it against you,” Ellison wrote in the inaugural column. “To burn out your brains. To lull you with pretty pictures. To convince you that throwing garbage in the rivers after your picnic is okay, as long as the factories can do it . . . To convince you that Viet Nam is a ‘struggle for Democracy.’”


As good as Ellison could be, his critique was limited by his obsession with content, and his blistering reviews of popular shows often sounded like the outbursts of a disappointed lover, a hipper version of the guy who spends all night in his recliner with a remote in hand, screaming, “I can’t believe they make me watch this crap!” Rarely did his writing cut as swift or deep as the anti-TV songs that began to appear in the first Nixon years. In 1970, Gil Scott-Heron released the legendary proto-rap track, “The Revolution Will Not Be Televised,” which was soon followed by The Mothers of Invention’s “I’m the Slime,” in which Frank Zappa intones, “You will obey me while I lead you / And eat the garbage that I feed you. . . Have you guessed me yet? / I’m the slime oozin’ out from your TV set.”

It’s pleasing to imagine the sociologist Herbert Gans listening to Scott-Heron as he penned an essay in 1972 for the American Journal of Sociology decrying a “drastic famine” of television research. The only people with any interest in studying TV as a medium, Gans marveled, worked for the media companies and their advertisers. His fellow professors, he surmised, might be too smitten with their televisions to cast a critical eye. But that was hardly an excuse. That same year, Nielsen reported TV was nearing a saturation point: 96 percent of U.S. households had at least one television running six hours a day—and eight hours in homes with children. There was a lot of overdue work waiting to be done.

Unreason and Counterrevolution

Activists, meanwhile, had begun to survey a post-sixties landscape dominated by a capitalist system that had absorbed and deflected its radical opposition. The odd television sitcom by Norman Lear was what passed for progress, even as the same Fortune 500 companies continued selling the same consumption-based economy and worldview as they did during Howdy Doody. Only now they marketed their obsolescent wares by mimicking the style and language of a commercialized counterculture.

For some who once sat at McLuhan’s feet, this realization coincided fortuitously with the publication of the first full English translations of Antonio Gramsci’s prison writings. As the imprisoned leader of the Italian Communists during the 1920s and 1930s, Gramsci developed the idea of “hegemony” to explain how class domination and ruling ideologies are reinforced and propagated, often subtly. Unknown thousands of newly opened eyes slowly turned to the flickering box in the corner. The whole time, the calls had been coming from inside the house. Activists and academics influenced by Gramsci, notably Todd Gitlin, began looking anew at television as a vector and enforcer of ideology. The more they looked, the more Gramsci’s ideas functioned like the sunglasses in John Carpenter’s 1988 anti-TV classic, They Live. Those groovy Coke commercial hippies harmonizing their desire to grow apple trees and honey bees? They’re aliens. TV Guide? It’s a fucking cookbook.

Colossal Youth

The cultural turn against television that followed in the 1970s and 1980s was not a movement of people who read Gramsci, or even Gitlin. The twentieth-century trope of the anti-TV snob—the insufferably pompous pipe-sucking virtue signaler who brags about not owning a TV set at every opportunity—seemed designed to deflect attention from the true face of TV rejection. Adolescents forming their politics in 1977 or 1997 didn’t see Kill Your TV stickers on BMW bumpers (though perhaps on the odd beat-up Volvo). They saw them on Peavey guitar amps, skateboards, and bass drums. As a target for adolescent rebellion, television was almost too perfect: It was the totem of a dull and passive domestic life and the lobotomizing tool of the state and corporate tastemakers. What’s not to hate, reject and smash?

For decades, depictions of television as a mind-numbing, soul-curdling technology were recurring motifs in both radical political circles and underground music scenes, especially but not exclusively punk, hardcore, and political hip hop. The English kids who blasted The Normal’s “T.V.O.D.” in council houses during the late ’70s were nobody’s cartoon snobs. It was an army of disaffected latch-key Reagan youth, not an alliance of Ivy League English departments, that built a cult pedestal under Repo Man, a film premised on teenage rebellion and escape from TV-zombie Boomer parents. (It is part of the film’s lore that an early version of the script rejected by the studio included a scene in which Otto burns down his house while his parents zone out to their televangelist.) The working-class macho of early Black Flag, whose ironic ode to prime time, “TV Party,” anchored Repo Man’s soundtrack, articulated the disdain of an entire ecosystem of independent record labels and publishers. As a teenager in the early and mid ’90s, my politics were forged in part by the music around me, songs like Gorilla Biscuits’ “Stand Still” (“It steals my time and wastes what I’ve learned / I’m holding out for a better deal, for something real”); Public Enemy’s “She Watch Channel Zero?!” (“I’mma take your set and I’mma throw it out the window / . . . her brain’s been trained by a 24-inch remote”); and Rage Against the Machine’s “Bullet in the Head” (“Corporations cold turn ya to stone before ya realize / They load the clip in, omnicolor / Said they pack the nine, they fire it at prime time / The sleeping gas, every home was like Alcatraz”).

The loud hatred of TV was instinctive and thoughtful, necessary and healthy. It also served as a portal for a lot of adolescents to the next stage of their political education. The world is a scarier place without it.

[Image: six TVs, stacked three by two, displaying distorted red and blue figures. © Derrick Schultz]

This portal was still young when the first great public TV burning was masterminded, not by German émigré philosophers who hated jazz, but by a freeform Bay Area art troupe called Ant Farm. On July 4, 1975, years after the first bra burning, the collective celebrated an “alternative Bicentennial” by driving a 1959 Cadillac convertible at high speed through a flaming pyramid of 45 televisions in the Cow Palace parking lot. They called it a “Media Burn.” Unlike the arcane and endlessly debatable meanings of today’s art and media spectacles, its message was straightforward: Fuck television—destroy it. But even the literal burning of a mountain of televisions proved difficult to decode for the confused local news scrum in attendance. To spell things out, an Ant Farm member delivered a speech in the voice and cadence of John F. Kennedy. “Television can only produce autocratic political forms, hierarchies, and hopeless alienation,” he said:

Mass media monopolies control people by their control of information. In our vast society, it is virtually impossible to escape the influence of commercial advertising. And who can deny that we are a nation addicted to television and the constant flow of media? . . . Now, I ask you, my fellow Americans, haven’t you ever wanted to put your foot through your television screen?

Cathode-tonia

Two years after the Ant Farm’s act of pyrotechnic propaganda, the journalist Marie Winn published the first critical investigation of television as a medium. Because so little research existed on the subject, The Plug-In Drug relied mostly on anecdote, interviews and conjecture. But its conclusions—that television was addictive, increased aggression, slowed cognitive development, decreased test scores, weakened families and enervated the innate human ability to play, to amuse ourselves, to sit still and fill our minds with our own thoughts—resonated with a national audience who suspected she was right, even if no one had ever made the case. The book was a bestseller.

A year later, Jerry Mander’s Four Arguments for the Elimination of Television offered a deeper, more radical critique, correcting what Mander called Winn’s failure to apply her findings “to the power drives of the wider society.” These were drives Mander understood well. He wrote as a repentant enabler of the TV plague after fifteen years as an ad and public relations man, including five years as the president of the San Francisco advertising agency Freeman, Mander & Gossage. His own Montag-like snap came during a 1968 cruise through the Dalmatian Straits, when he attempted to appreciate a brilliant natural scene before him—“rocky cliffs, rolling seas, dazzling sky, and colors as bright as a desert”—and realized he couldn’t. “I felt nothing,” he writes. “Something had gone wrong with me.” That something was television.


Mander returned to San Francisco, dissolved his agency and opened the country’s first marketing firm for nonprofits, Public Interest Communications. Working on environmental and human rights campaigns, he noticed his clients were basing strategies entirely around garnering maximum television coverage. The splintered movements of the 1970s were “media based” and “derivative of the needs of” nightly news segments that “determined the style and content (or lack thereof) of all political action.” The logic of TV encouraged spectacle for its own sake, making it difficult to explore complexities or connect issues—pollution, alienation, war, poverty—into a coherent worldview and struggle.

“Overall understanding of the forces that were moving society seemed to be diminishing,” Mander wrote. A televised “glut of information was dulling awareness” and encouraging passivity and confusion, not clarity and involvement. The revolution, in other words, would not be televised.

Form Obliterates Function

The thesis of Four Arguments is built upon the rubble of an unflinching assault on Marshall McLuhan—one that is both entertaining and necessary. While McLuhan’s biggest ideas were important and sound—a society’s technologies determine the nature of its thoughts and conversations; modern communications make a global culture possible—much of what he wrote was nonsensical, wrong, or reflected a disqualifying lack of interest in the political economy of modern media. [2]

Mander marshaled his insider’s knowledge of television to turn McLuhan on his head. The medium was not just the message; it was the problem, and an irreformable one. He argued that television was best suited to advertising and propaganda, that it destroys the capacity for sustained and critical thought, shatters historical perspective, and stunts the individual’s ability to feel or experience deeply. “Television offers neither rest nor stimulation,” he writes. “Television inhibits your ability to think, but it does not lead to freedom of mind, relaxation, or renewal.”

Even when it tries to deal with oppositional subjects and themes, such as non-materialist values and worldviews, or the systemic causes of inequality and environmental destruction, television shoehorns them into deforming narrative tropes and splices them into oblivion. Words and images are forgotten as quickly as they flow into the next commercial, the next show, ceasing only with the orchestral version of the national anthem that for decades brought network broadcasting to a temporary, staticky rest.

Mander framed Four Arguments around the planet’s deepening ecological emergency. The reigning paradigm of consumption and growth could not be questioned, let alone disrupted, he argued, so long as television was setting the terms of our thoughts and debates, not to mention limiting our very ability to think and debate. Four decades later, Mander’s arguments hold up all too well.

Binging Without Purging

A sustained burst of anti-TV books followed the salvos by Winn and Mander. In his 1979 broadside against the terminal self-involvement of postwar American life, The Culture of Narcissism, Christopher Lasch referenced television often. The following year, Vance Packard updated his Eisenhower-era exposé of television advertising, The Hidden Persuaders. In 1985, Neil Postman’s Amusing Ourselves to Death warned that the West was entering the screen-and-pill entertainment coma described in Huxley’s Brave New World. [3] In 1986, Fairness and Accuracy in Reporting began publishing its newsletter, Extra!, which continues to valiantly catalogue every new reason to avoid commercial broadcast news. In 1988, Edward Herman and Noam Chomsky published Manufacturing Consent, a narrow but damning study of the structural biases of the major networks’ news departments. [4]

In 1992, Bill McKibben’s The Age of Missing Information dug into the link between television and humanity’s accelerating glide-path toward environmental ruin. McKibben’s previous book, The End of Nature, had laid out the new science of climate change. Missing Information explored how TV made the changes mandated by the grim findings of environmental science difficult, if not impossible, to realize. The book was structured as a kind of diary documenting McKibben’s viewing of well over one thousand hours of cable television—nearly every minute of programming on 93 channels during a 24-hour period—and then comparing the experience with sitting on a remote mountaintop in the Adirondacks observing nature and contemplating humanity’s place within it. The result is a personal account of how the chaotic and never-ending “flow” of television cracks and cheapens our experience of the world, destroying our ability to form coherent narratives about what we are, and should be, doing here. Even after we shut the TV off, we are unable to process the information we most need to educate ourselves about “the physical limits of a finite world. About sufficiency and need, about proper scale and real time, about the sensual pleasure of exertion and exposure to the elements, about the need for community and for solid, real skills. About the good life as it appears on TV, and about other, perhaps better lives.”

The Off Position

Around the time these words appeared, two twenty-something D.C. housemates named Henry Labalme and Matt Pawa were having late-night conversations along similar lines. They decided that saving the planet meant killing TV, and that killing TV required more than just publishing another book. In 1994, Labalme and Pawa launched TV-Free America with the goal of cutting household viewing time in half within a decade.

TV-Free America, whose advisory board members included Marie Winn and Jerry Mander, made no mention of hegemony, capitalism, manufactured desires and consent, or environmental tipping points. The group focused instead on the personal and social benefits of reducing exposure to the “passive, sedentary and non-experiential” medium. As Labalme explained in 1997, the intention was “not to beat people over the head with this idea that TV is rotting their brain, that it’s destroying their communities, but to say, try life with a little less TV and a little more time, and you’ll have more fun.”


This moderate approach helped build “TV Turnoff Week” into a mainstream global phenomenon. Between 1995 and 2000, the organization’s signature event signed up more than 50,000 partners worldwide, mostly schools, social clubs, community organizations, and religious groups. It racked up endorsements from twenty-five governors and Bill Clinton’s surgeon general. Even Pope John Paul II got in on it, calling on Catholics to give up TV for Lent. Eight years after its founding, the group reported sister organizations active in more than twenty countries. Some of these broke from TV-Free’s moderate style. Kalle Lasn’s Vancouver-based Adbusters Media Foundation attacked television with biting neo-Situationist riffs on social atomization, ecocide, and consumerism. (The group’s subsequent devolution into sneaker sales and TV “subvertising” is another story.) In the UK, a boisterous group called White Dot organized community TV turnoffs and marketed TV-jamming technology and jewelry allegedly made from the detritus of televisions smashed by the Taliban in the public squares of Afghanistan. The group’s founder and leader, David Burke, believed any further theorizing only made the problem worse. “If the ‘off’ button is the answer,” read the White Dot manifesto, “then no media studies course will ever help students find it. . . . all media studies can only chase its tail.”

The movement spawned by TV-Free America obviously failed to reduce TV consumption, or slow the proliferation of screens in homes and public spaces. What it did do was reinforce a fragile line of defense against the total normalization of a corporatized, image-based culture unable to see itself clearly, let alone formulate alternatives. Kill Your TV was always more than an admonition to get some air, or a symbol of belonging to this or that subculture. It was an entry point. Smashing a TV is symbolic of a bigger and never more necessary rejection, one that suggests the possibility of conversations and politics liberated from the boundaries of the corporate screen.

Not All There

Labalme and Pawa’s ideas and activism from the not-so-distant past have met the same treatment from today’s critical consensusphere as Bradbury’s novel did in the hands of HBO. If that new consensus has a spokesperson, it’s Emily Nussbaum, the New Yorker TV critic who this year released a book of essays, I Like to Watch. That Nussbaum sees herself as running a victory lap for television, a Pulitzer draped around her neck, is clear from the collection’s title, taken from Jerzy Kosinski’s 1971 satirical anti-TV novel, Being There. Long associated with the book’s simpleton-antihero, the phrase has been reclaimed by Nussbaum with something between a wink and a sneer. But the cleverness doesn’t really survive inspection. Of all the anti-TV tropes Nussbaum might have chosen to smugly flip in 2019, Kosinski’s prediction of a catchphrase-spitting manchild celebrity whose TV-roasted brain carries him to the White House is an odd choice.

The title is only the first stumble in Nussbaum’s victory lap. I Like to Watch opens with a version of the TV wars as a fairy tale about the shifting “status” of the medium in the cultural pecking order. (And by extension, the status of writers like Nussbaum.) In Nussbaum’s telling, she once looked down on television just like everyone else, influenced by the “drunken cultural brawl” over television’s “crappy reputation.” Despite perfunctory mentions of Winn and Mander, it’s clear she never engaged with the radical core of the anti-TV position, but stood with arms crossed in the art-versus-entertainment sideshow tent. Still, she remembers the world before the universal embrace of television as a fraught place, full of shameful urges, strange ideas, and wondrous if grotesque creatures. “People still referred to television, with no irony, as ‘the boob tube’ and the ‘idiot box,’” she writes of the late 1990s, as if describing a flying lizard that survived an extinction event. “This was the value system that I was soaking in, Palmolive-style.”

Nussbaum is proof that you can soak in something without absorbing it. The acclaimed critic still understands arguments about TV as a matter of determining what you can say out loud at parties, and not of ideology, commercialism, propaganda, ecology, human health, and, ultimately—as once suggested in a book by her one-time New Yorker colleague Bill McKibben—the survival of the species.

Reading Nussbaum’s chronicle of the TV wars—a redemption story about how “a sparkling multiverse of cable channels” released humanity from the “status anxiety” of liking to watch—it’s easy to envision her as a graduate student growing bored with tedious Victorian novels (the PhD track she says she abandoned after a revelatory episode of Buffy the Vampire Slayer) and starting to run the mental math that tracks her favorite shows’ placement, among all other competitors for her attention, on some kind of status-compass grid. In 2004, she would take this skill public in New York magazine with the creation of “The Approval Matrix,” the hot-or-not culture atlas whose trademark four quadrants neatly explain Nussbaum’s lack of interest in Mander’s four arguments.

Big Brother is You, Watching

It’s hard to imagine now, but the rise of on-demand and mobile platforms for TV content did cause a panic among those with stakes in the old television order. Beginning in 2010, viewing time in U.S. households started to dip for the first time since Nielsen began keeping track. It seemed possible that the “fourth screen” in our pockets—with its expanding universe of apps and streaming services—might weaken television’s long deathlike grip on our national attention, if only by replacing a mother source of commercial distraction with a million little ones. [5]

Alas, the panic was short-lived. Corporations quickly learned to control this new world and accomplish even deeper levels of “brand penetration” on a terminally distracted, screen-addled public. In 2013, a team at Innerscope Research, a marketing firm, published an article announcing the good news in the Journal of Advertising Research. The piece had a lurid Huxleyan title that no doubt would have caused Neil Postman, who died in 2003, to break out into a knowing grin: “Leveraging Synergy and Emotion in a Multi-Platform World: A Neuroscience-Informed Model of Engagement.” The researchers explained that, rather than threatening the power of television, a “flexible media environment” can actually facilitate “activation of brand associations from previous exposure across platforms.” The Clockwork Orange–style pupillometrics experiments of the 1960s look downright Socratic next to the fMRI tube experiments being conducted by Nielsen’s screening-room laboratories across the country, where brain scientists on the bleeding edge of the new TV-internet paradigm target precise coordinates in our lateral prefrontal cortex to facilitate brand memory encoding. (Nussbaum does not have much to say about any of this in her celebrated TV book. In a chapter on the rise of “integrated sponsorship” that puts brands and products at the center of scripts and story lines, she boldly concludes, “If Tina Fey thinks it’s okay, who am I to disagree?”)

In 2015, Nielsen, a $6 billion company, purchased Innerscope Research and installed its founder and CEO, Dr. Carl D. Marci, as the parent company’s Chief Neuroscientist. A fixture on the TV-internet branding circuit, Marci is celebrated as a visionary at events like South by Southwest and the TV of Tomorrow conference, where the future is an ever-growing constellation of screens.

It’s not surprising that Marci and his peers in the fields of social and consumer neuroscience are received as heroes in this world. What should be surprising, and deeply depressing, is the disappearance of the world where guys like Marci are seen as villains who sometimes end up on the wrong end of a flamethrower. Our imaginations can no longer absorb a narrative in which Guy Montag is the hero, television the enemy, and his wife the tragic warning. The portals to this world were never on center stage, but they existed, on newsstands and zine racks, in books, in leaflets, in the boomboxes of high school parking lots. HBO, which has done more than most to make this world vanish, did nothing brave with its production of Fahrenheit 451. By scrubbing TV from Bradbury’s story, it papered over its own pixelated reflection and ours, sparing us from reckoning with the fact that we are all Mildred Montag now.



[1] Truffaut’s 1966 movie goes further, anticipating and satirizing the psychology of social media validation by having the actors turn to the camera and tell Mildred, with rote enthusiasm, “You’re absolutely fantastic!”

[2] Anticipating his acolyte Jean Baudrillard, McLuhan treated all knowledge as undifferentiated “data,” once describing the explosion of an atom bomb as “pure information.” But McLuhan didn’t think immersion in TV’s chaotic image-scape resulted in apathy, confusion, and the degradation and ultimate disappearance of meaning. He thought it did the opposite, elevating and energizing viewers into meaningful dialogue and action. This misplaced faith in TV could lead him into idiocies such as the claim, made in Understanding Media, that TV “has ended the consumer phase of American culture.” With McLuhan it’s not clear if this is cause for celebration or lament. In the same book, McLuhan extols advertisements for “represent[ing] the toil, attention, testing, wit, art, and skill of . . . highly skilled and perceptive teams,” the products of which are “magnificent accumulations of material about the shared experience and feelings of the entire community.” Ads weren’t directed at the viewing public; they emanated from it.

[3] Many of Postman’s points were made better and earlier in Four Arguments, but Postman acknowledges Mander by name only once, in a pissy dismissal of “preposterous notions such as the straight Luddite position outlined in Four Arguments Against Television.” Postman may have been triggered by Mander’s assault on McLuhan, and goes out of his way to huff at the “fashionable disavowal” of McLuhan by scholars who “were it not for [him], would today be mute.”

[4] There was a countervailing trend during this same period. The “populist” turn in media studies during the 1980s led to an academic cottage industry committed to understanding television, not as an ideological enforcer and enemy of thought, but as a site of audience agency, engagement and “resistance,” where the only thing worth criticizing was a lack of representation.

[5] The mobile internet wasn’t the first threat to television as a top-down, one-way advertising and consciousness-colonizing medium. As Tim Wu recounts in The Attention Merchants, Eugene Polley’s invention of the remote control in 1955—first produced in the shape of a gun, to “shoot out” the ads—was heralded as an empowering, even liberating technology that would end television as we knew it. But commercial TV survived and flourished, just as it has survived the internet, TiVo, social media, and streaming.