What Are Intellectuals Good For?

We have only words against

Do intellectuals matter? In this age and country, there’s room for doubt. Certainly we haven’t diffused general enlightenment, which is our job. Among the countless examples of American un-enlightenment I’ve seen reported in recent years: 50 percent of Americans told pollsters that the earth has been visited by UFOs, and nearly all of them also believed that the U.S. government has covered up this fact. Forty percent did not know whom the United States fought in World War II. Six percent reported reading one or more books a year.

Majorities of Americans believed that Saddam Hussein helped al-Qaeda carry out the atrocities of 9/11 and that weapons of mass destruction were found in Iraq after the U.S. invasion. In a recent NPR poll, a majority of Americans either agreed with or were unsure about the claim that “a group of Satan-worshiping elites who run a child sex ring are trying to control our politics and media.” The pandemic spawned its own delusions: early in 2020 approximately one-third of Americans believed that scientists had created and disseminated the coronavirus; a few months later, another third believed that the virus was sent by God to teach humanity a lesson. On the day of the Capitol Hill riot, 39 percent of Americans believed the 2020 election had been stolen, for no better reason than that President Trump said so. To all such people, the American intelligentsia has been of very little use. We may as well have been publishing our books, essays, and op-eds on Mars.

In another sense, however, we matter too much. A loathing for intellectuals was almost a defining characteristic of Trump’s base. At one point in the 2016 campaign, Trump told a crowd gleefully: “I love the poorly educated!” He didn’t love them enough, apparently, to offer them more than a few crumbs in his huge, one-percent-friendly tax cut the following year. But did intellectuals succeed in pointing out that hypocrisy to the poorly educated? Did we try?

It’s not entirely our fault, perhaps. If we had tried, we would have encountered a profound mistrust of intellectuals, skillfully cultivated by generations of Republican political strategists. “Eggheads,” “pointy-heads,” “New Class,” “silent majority,” “real Americans,” “feminazis,” “baby-killers,” “sushi-eating,” “latte-drinking”—with these and many other tropes, Republican politicians and their operatives and media surrogates, from Paul Weyrich to Lee Atwater to Frank Luntz to Karl Rove to Rush Limbaugh and Sean Hannity, have planted in less-educated voters not a healthy and discriminating skepticism toward experts but a belligerent and preemptive rejection of all complexity, leaving them vulnerable in turn to Republicans’ kindergarten-level ideas about supply-side economics, abortion, immigration, race, evolution, and climate change.

What’s more, we wily intellectuals allegedly have designs on everything our stalwart countrymen hold dear. In concert with power-hungry liberal politicians, we are planning to introduce radical innovations in every sphere of social life: childcare, schooling, zoning and city planning, law enforcement, marriage, and religious liberties, trampling on long-settled customs and traditional understandings, until ordinary Americans no longer recognize their country. As a sales campaign, this has been fantastically successful. And as always with successful propaganda, there’s a grain of truth in it. Intellectuals and liberal politicians have rarely taken seriously enough their democratic obligation to persuade people before legislating for them. (Of course, this may partly be because politicians now have no time to talk to voters: they must spend 50 percent of every day raising money, an entirely predictable result of Citizens United and other Supreme Court decisions that have enshrined money as the arbiter of American politics.)

Whatever blend of liberal arrogance and conservative chicanery is to blame, the gulf between intellectuals and our fellow citizens is very wide. What’s more, political propaganda and campaign finance laws are not even the most important obstacles to a democratic culture. They are, so to speak, contingent obstacles; there are other, more fundamental ones that arise from the very structure of ownership in this society. Marx observed: “In every age, the ideas of the rulers are the ruling ideas.” He did not mean, of course (it is generally necessary in the United States, when discussing Marx, to begin by explaining that he did not mean what he is usually taken to mean), that capitalists go into the marketplace to buy young intellectuals, like young slaves or young peasant girls, whom they then train up to service; nor that intellectuals, once established, offer themselves in the marketplace to the highest bidder; and certainly not that the ideas of the rulers are usually the best and most persuasive ones. He meant that, since the rich get the social and economic arrangements they want in virtually every society, and since legitimation is an essential part of accomplishing this, and since intellectuals are the primary agents of legitimation, the rich will take care that intellectuals and the institutions in which they operate—most of them, anyway; uniformity looks bad, so a certain amount of dissent is tolerated—foster the right ideas. A.J. Liebling agreed with Marx, about this if nothing else: “Freedom of the press is guaranteed only to those who own one.”[*] Liebling and Marx are pointing out the obvious: who pays the piper calls the tune. To the extent we believe this, we are historical materialists. To the extent we disbelieve it, we are naive indeed.

Many intellectuals and journalists nonetheless do disbelieve it, insisting that “no one tells me what to write.” Very true; ideological control is much subtler in capitalist societies than it was in communist ones. Usually, in fact, it is not overt control at all; that is, not one person or group laying down the law for another. It is, rather, circumstantial or structural control, a matter of the constraints imposed simply by living in a minimally regulated market society.

Imagine a society in which intellectuals are free to write anything they want but it is forbidden to sell magazines or books. Under these peculiar circumstances, intellectuals would technically be free, but their freedom wouldn’t be worth much. Now imagine a society in which intellectuals are still free but the overwhelming majority of the society’s members—their intended readers, who desperately need the truths the intellectuals have to offer—are tired and stressed, have very little spare money for books or free time to read, are continually distracted by gaudy and often sexualized advertisements in every medium, did not receive a high-quality education, and have internalized the society’s dominant ethic of competitive individualism rather than cooperative solidarity. These are not, unfortunately, peculiar circumstances but pretty much the way things are in the United States and have been for the last forty years. Under these circumstances the freedom of intellectuals is, again, not worth much.

How did things get this way? The rise of the New Right, funded by corporations, foundations, and wealthy individuals, guided by political consultants and neoconservative intellectuals, and channeled by the Republican Party, is a familiar story. Highlights include: the destruction of labor unions (accomplished by appointing antilabor lawyers and business executives to the National Labor Relations Board, where they ignored labor-law violations or delayed addressing them for so long that the organizing drive in question simply died); the sabotage of Hillarycare, the attempted sabotage of Obamacare, and other unflagging Republican efforts to prevent tens of millions of Americans from having health insurance; NAFTA and financial deregulation (bipartisan efforts, the Democratic Party having turned sharply rightward); a massive shift of the tax burden away from the rich and toward the non-rich (accomplished by three large and lopsided tax cuts, in the Reagan, Bush II, and Trump presidencies, as well as by policy directives to the IRS to audit more taxpayers from the bottom half of the income distribution and fewer from the top 1 percent); and an all-out Republican assault on government, including constant efforts to privatize education, prisons, war-fighting, Medicare, the Post Office, and Social Security. This is how one produces an insecure, atomized, and resentful populace with a short attention span.

Along with these obstacles on the receiving end, intellectuals face difficulties on the delivery end. Newspapers and even television once functioned more or less as public utilities. No longer. Media are big business. Concentration and centralization are the rule in a capitalist economy, as companies pursue tax advantages, market power, and organizational synergies. With expansion comes debt, and with debt comes pressure to cut costs and stabilize revenues. This has regularly meant, as New York Times editor Max Frankel once wrote in frustration, “more sex, sports, violence, and comedy,” while “slighting, if not altogether ignoring, news of [serious subjects].” And conglomeration often means eliminating family ownership, which has at least occasionally allowed noncommercial values some scope within media organizations.

The new owners may have conservative opinions, as moneyed people often do, but whatever their opinions (if any), they are powerless to impose them on an institution that ultimately answers to the market. The institution will adopt a point of view—usually the conventional wisdom—least likely to upset the average reader/viewer and most likely to put him/her in a receptive frame of mind toward the upcoming commercials, which, for the newspaper or magazine or station, are what really matter.

The conventional wisdom is sometimes right and sometimes wrong. But it is always—by definition—easier to state than a critique of the conventional wisdom. It is simply what everyone knows: for example, that raising the minimum wage increases unemployment; that governments, like households, must balance their budgets; that the private sector is always more efficient than the public sector; that the United States promotes freedom and democracy throughout the world; that the truth generally lies between the “extremes” of left and right. The sources of conventional wisdom, in any society, are those in authority: state agencies or administrators, business managers or their spokesmen, and accredited experts—the latter are those who have undergone professional or academic socialization and have not forfeited their credibility by too pronounced an opposition to the conventional wisdom.

In most situations, editors, publishers, and producers will default to the conventional wisdom, for two main reasons. First, it is very much cheaper to source. Government and business both run colossal propaganda operations, which helpfully supply reports, research summaries, informational films, and other materials presenting their point of view, often even before they’re asked. Those willing to accept the official perspective (either public or corporate) find their work already done for them. Those who aren’t willing must do a lot of extra work, often involving extra expense.

Another, probably more important, reason for hewing to the conventional wisdom is that the penalties for departing from it can be severe. Those same friendly government and business propaganda outfits stand ready to contest every fact and interpretation in a critical story, and sometimes to sue, even on frivolous grounds. Media executives don’t want this kind of grief, as they make very clear to editors and producers. For all these reasons, wide-ranging, properly antagonistic investigative reporting, which public intellectuals cannot do without, is an endangered species in America.

All this pressure toward conformity, notice, has been produced without anyone telling anyone else what to write. That does happen, to be sure: New York Times executive editor A. M. Rosenthal tilted the paper’s coverage of Central America and the Middle East rightward during the 1970s and 1980s, in accordance with his own neoconservative views; and William Sarnoff, chairman of Warner Books, personally intervened to suppress publication of the first monograph edition of Noam Chomsky and Edward Herman’s The Political Economy of Human Rights. But for the most part, neither censorship nor any other kind of coercion is necessary. The ideas of the rulers are transmuted into the ruling ideas smoothly and frictionlessly, by a series of buffers, barriers, and pre-settings, shepherding us toward safe opinions or, if we persist in inconveniently radical opinions, shunting us toward the cul-de-sac of publication at the margins of public conversation, isolated with like-minded eccentrics.


Why write, then, if failure and frustration are virtually inevitable? Underneath the usual reasons—vanity, righteous indignation, a simple pleasure in fashioning sentences—I believe there’s usually gratitude. From admired writers we’ve received a gift that we’re eager to pass on. They model probity, fearlessness, tact; they make the intellectual virtues irresistible and their exercise compelling. To impart to even a few readers my intense and complicated affection for Serge and Orwell and Pasolini, Trilling and Illich and I. F. Stone, seems a duty both to them and to those readers. To help install figures like these in our culture’s permanent memory is one responsibility of us lesser intellectuals. (And to let a little air out of the reputations of William F. Buckley and Irving Kristol is also worthwhile, and very satisfying.)

T. S. Eliot observed that “Dante and Shakespeare divide the world between them; there is no third.” If I had to choose the exemplary public intellectual of my generation (or spanning my generation), I would say Noam Chomsky, and I might very well add: “There is no second.” Certainly no one else approaches his preternatural rigor or dialectical virtuosity. One critic described Chomsky as “a logic machine with a well-developed moral imagination.” That’s good, but it leaves out the astonishing abundance of detail that makes his books an encyclopedic history of American depredations in Southeast Asia, Central America, and the Middle East over the last sixty years, as well as the (barely) restrained sarcasm, unshowy but lethal, that makes of his indignation a high style.

If there is one theme that unifies Chomsky’s vast corpus, it is moral universalism: the insistence that we apply to ourselves and our government the same moral standards we apply to others. This directly contradicts American exceptionalism: the belief, usually assumed rather than argued, that the United States is unique in contemporary, perhaps even world, history in acting abroad for selfless purposes, often at considerable sacrifice, in order to bestow or defend freedom, democracy, and prosperity. American exceptionalism is so commonplace that it is unusual to read a whole issue of the New York Times, the Washington Post, Time, Foreign Affairs, the Atlantic, the New York Review, or even The New Yorker without encountering some version of it. American policy always gets the benefit of the doubt, even when there is no doubt. The United States was “containing Soviet expansionism” after World War II, even though the left-wing movements in Greece, Italy, Guatemala, and Iran were indigenous and by no means Soviet creations, while in each of those countries the United States brought to power right-wing governments, all of them unpopular, and most of them harshly repressive. The United States was “defending South Vietnam,” though it knew perfectly well (and admitted in internal documents) that the insurgency it was bombing so unrestrainedly had the support of South Vietnam’s population. The United States invaded Iraq in order to “liberate” the country from the tyrant Saddam, although it had warmly supported the tyrant Saddam for a dozen very brutal years, until his fealty was no longer assured. Right through the Obama administration, much of the press and academic scholarship maintained their habits of deference to the conventional wisdom. Chomsky’s powerful criticisms and extraordinary public reach have provided a small but important skeptical counterweight.

Undoubtedly Eliot had third and fourth great authors up his sleeve, and I have many other keenly admired public intellectuals up mine: Ralph Nader (who in addition to founding and strategizing for dozens of citizens’ groups, lobbying tirelessly on Capitol Hill, and exhorting students at practically every law school in America to do something useful with their lives, has written two dozen books and countless newspaper columns), Barbara Ehrenreich, Thomas Frank, Thomas Geoghegan, and plenty of others: Serge, Silone, Orwell, Pasolini, Bourne, Stone, Macdonald, Howe, Rorty, Lasch. What they have in common with Chomsky, and with one another, is a combination of discrimination and democratic passion. Their discrimination—moral intelligence, really—allowed them to make relevant distinctions and get difficult decisions right. Mostly right, anyway: they made mistakes, like Macdonald’s pacifism in World War II and Howe’s harsh and ungenerous response to the New Left of the 1960s, and more consequentially, Nader’s failure to overlook the Democratic Party’s outrageous treatment of him throughout the 2000 presidential campaign and magnanimously bow out of the contested states. Howe and Macdonald did, though, get another, very important distinction right. The majority of their contemporaries went from uncritical support of the Soviet Union to uncritical support of the United States, unable to orient themselves in the political world without wholehearted partisanship. It is a very common disability, which is why the example of Macdonald and Howe, who kept their critical antennae pointed in all directions, was so useful. Nader’s immense usefulness was a result not so much of judgment as of energy and persistence. That our air and water are not even dirtier than they are, the atmosphere not more full of poisons and particulates, our product labels not more misleading, and our regulatory agencies not more beholden to the industries they’re supposed to regulate, is Nader’s doing more than anyone else’s.

By democratic passion I mean the constant remembrance that democracy entails not merely that the people should be governed well but also that the people should govern. For the last century, since the defeat of Populism—the most significant working-class movement in American history—there have been two broad factions in American politics: the business party and the Progressives. Unlike the former, the latter at least had an idea of the general welfare and acknowledged the need for some regulation of business. But they envisioned no role for most citizens except to vote every two or four years. Government should be left to experts, who would duly take note of the public’s biennial or quadrennial bleating.


Contemporary intellectuals—intellectuals in every age—also need plenty of discrimination. Here are some discriminations that seem to me worth making now, though others may feel they are too obvious to need stating. Although Yeats sympathized with Irish fascists, he was nevertheless the best English-language poet of the twentieth century. Although Eliot disdained working-class politics and made several anti-Semitic remarks, he was the next best. Although Lawrence briefly despaired of democracy and flirted with authoritarianism, he was a great novelist and a great spirit. Although Flannery O’Connor used the N-word and declared herself a “segregationist by taste,” she also declared herself (well in advance of most other Southerners) an “integrationist in principle” and, more important, wrote powerful fiction in which Blacks were fully imagined. Although Saul Bellow and Philip Roth were unpleasant to the women in their lives and their novels, they were master stylists and storytellers.

Cancel culture also calls for discrimination. Leon Wieseltier was a much-admired (for good reasons and bad) literary editor of The New Republic who behaved badly to his female subordinates. Certainly any future female subordinates should be protected from him (as all female subordinates everywhere should be protected). But that an ambitious and exciting new magazine he had organized should be disbanded and its first issue pulped when his transgressions were revealed—this panicky overreaction was a sin against culture. (Fortunately, he found a braver patron and returned with an even more successful magazine.) In my hometown (Cambridge, Massachusetts), a school committeewoman with a strongly progressive record was taking part in a public discussion of a proposed high school course on racist language in American history. At one point, referring to some material that used the N-word in full, she used the N-word in full, clearly referring to the word as part of the course material and not to any person, present or past. It could not have been a more obvious case of mention rather than use. Nevertheless, three days later, under pressure, she resigned from the school committee. Surely cancellation should be reserved for unrepentant mass murderers: Henry Kissinger, Dick Cheney, and Donald Rumsfeld, for example, all of whom went on to live serene and prosperous post-criminal lives.

Intellectuals should also be discriminating about where their energies go. Symbolic politics has its claims, but it has occupied a disproportionate share of left/liberal attention for some time. Increased minority and female representation in elite professions does not trickle down, after all, any more than tax cuts for the rich do. In the eastern United States, Black neighborhoods have on average 66 percent more air pollution than white ones. If leftists and liberals had paid as much attention to this disparity as they lavished on, say, the case of Rachel Dolezal, who believed that because she grew up with four Black siblings, married a Black man, taught Black history, and got herself elected president of her local NAACP chapter, she was entitled to call herself Black, the EPA might not have found it quite so easy last year to ignore pleas from scientists and community activists to lower permissible pollution levels. Which would have saved a lot of Black lives.

According to the Urban Institute, the median net worth of African American families is around $17,000 (the median for white families is several times higher). The 50 percent of Black households below the median could probably not meet a medical emergency or invest in their children’s education or buy a house without taking on crippling debt. Recently thousands of theater professionals signed a manifesto demanding that more than half of Broadway theaters be named after people of color and that more than half of all actors, writers, directors, and designers employed by theater companies be people of color. The National Book Critics Circle publicly apologized last year because only 30 percent of its annual book awards went to people of color and only 25 percent of its board members are people of color, while the fact that Blacks make up only 5 percent of the publishing industry’s workforce was considered clear evidence of “institutionalized racism.” Is it possible that economically vulnerable Blacks would prefer to have more Black and white allies in their desperately unequal struggle for economic fairness than more Broadway theaters named for Blacks and more Black actors, directors, and book reviewers? Perhaps we should ask them.

One more example: several million girls (and roughly the same number of boys) in the developing world die each year from malnutrition, pneumonia, diarrhea, malaria, and other easily preventable diseases. A hundred thousand adult women suffer obstetric fistulas each year: easily repaired by a simple surgery but devastating if not repaired, and often not repaired for lack of facilities. Aren’t these horrors worth many orders of magnitude more Twitterstorms than, say, J. K. Rowling’s views on transgender rights—which, right or wrong, can hardly produce as much suffering as an obstetric fistula?


The responsibility of intellectuals has been a live topic since intellectuals came into existence in the eighteenth century. Clearly our responsibility is to écraser l’infâme, or, put more modestly, to lessen the monstrous injustice in the world at least a little. In the eighteenth century, l’infâme was superstition and the clerical power that imposed it. In the second half of the twentieth century it was, for American intellectuals, American power, which instigated or supported more than a dozen right-wing coups, resulting in the murder, torture, and imprisonment of millions and the economic exploitation of scores of millions. Since the comprehensive failure of American policy in Afghanistan, Iraq, Syria, and Libya, both elites and the public are wary of further interventions, which the United States cannot afford in any case. America’s grotesque economic inequality would seem like a natural candidate for the twenty-first century’s infâme, other things being equal. Unfortunately, other things are not equal—far from it.

For intellectuals and everyone else, one responsibility, I would say, now trumps all others. It’s not justice. The near-demented zeal of today’s Republican Party for further enriching the already rich is novel in degree, but plutocracy has been the rule in America since John Jay admonished his fellow Founding Fathers that “those who own the country ought to govern it.” Those of us who reject that ignoble creed have the memory of the New Deal and European social democracy to pit against it; and what is the current popularity of “socialism” among the young but a revulsion against the obscene inequality that disfigures twenty-first-century America? Of course we must defeat plutocracy. But it may have to wait.

I don’t mean nuclear disarmament, either. In the Nuclear Non-Proliferation Treaty of 1968—like the UN Charter, the most solemnly binding of legal instruments, however routinely disregarded by the Great Powers—the United States promised to gradually reduce its stockpile of nuclear weapons to zero. It has not done so, of course, and neither has any other signatory. The deadly, delusional logic of deterrence still prevails, despite many accidents and (at least) one false alarm that brought the Soviets to within minutes of a full response and the world to within minutes of unthinkable calamity. It is insane to expect that no accidental or deliberate use of nuclear weapons will ever occur—that our luck will hold forever. Activism to keep the danger in public view will always be necessary. But seventy-five years without an actual apocalypse has induced an almost insuperable mental inertia among Americans, intellectuals and ordinary citizens alike. Practically no one believes that nuclear catastrophe is really possible—or at any rate, likely enough to make it worthwhile to try to resurrect the international antinuclear movement of the 1980s.

It is likely that the human race would survive a full-scale nuclear war, in some form. We would probably also survive the results of burning fossil fuels at present rates indefinitely. The casualty level in both cases would probably be similar—in the hundreds of millions—though more drawn out in the case of global warming. The difference is that the earth is already burning. Earth’s average temperature has already increased by nearly 1.2° Celsius (roughly 2.2° Fahrenheit) since 1800. An increase of 6°C (approximately 11°F) would be, insofar as one can compare unimaginable things, equivalent to an all-out nuclear exchange.

But while the aftermath of nuclear war would see countless deaths from radiation and starvation, it is possible to imagine a gradual return to normalcy over several decades, with the debris clouds dispersed and much of the remaining population conscripted to scrub all affected surfaces of radiation. On the other hand, if we reach 6°C—which we could very well do sometime in the next century by burning every reachable drop of oil, gas, and coal still in the ground, as the energy industry would like to do and most Republicans would be happy to let them do—there will be no return to normalcy. It will have taken an inconceivable amount of energy to have reached 6°—to have melted the ice caps and the permafrost, supercharged hurricanes and typhoons, created large dead zones too hot for human or animal habitation, killed off millions of species, and raised sea levels dozens or hundreds of feet, drowning coastal cities where hundreds of millions of people now live. There will be no reversing these changes, even if geoengineering is more successful than it currently looks to be. We will have added two trillion metric tons or more of carbon dioxide to the atmosphere, and there will be little or no ice or snow left to reflect sunlight back into space. If we reach 6° hotter, the earth will probably stay at least that hot for a thousand years.

Cato the Elder ended his every speech to the Roman Senate with the exhortation: “Carthage must be destroyed!” The exigencies of Roman imperialism, and of everything else, now seem utterly trivial compared with the exigencies of planetary survival. And so you would think that every congressperson and senator would end every speech with “Leave the oil in the ground!” or “All energy from the sun!” But Republicans are deaf, dumb, and blind on this subject (and other subjects), and Democrats are, as they always are in a good cause, faint-hearted. It is up to intellectuals (and Scandinavian teenagers, apparently) to be importunate. Three excellent books—Falter by Bill McKibben, The Uninhabitable Earth by David Wallace-Wells, and Our Final Warning by Mark Lynas—will put anyone in an evangelizing mood and supply irrefutable arguments.

If only arguments moved the world. The raw greed and colossal financial power of the energy companies are impervious to argument. Still, argument is what intellectuals do, and it’s not always ineffectual. Silicon Valley is not beholden to Big Energy and commands similarly vast financial resources. It is not inconceivable that, lacking any positive financial incentive to ruin the planet (rather than merely colonize our inner lives), Silicon Valley might finance a popular movement and throw its weight around in Congress. If it does, or if someone else does, that popular movement will need intellectuals, above all to neutralize the pseudo-intellectuals that Big Energy has paid for several decades now to misrepresent and obfuscate climate science.

To tilt at the state and capital or to ignore them: this has always been a choice for intellectuals. Both alternatives are morally plausible, even if those who chose the first alternative have often called those who chose the second irresponsible. Nowadays, though, that charge rings hollow: next to the irresponsibility of energy executives and Republican politicians, no one else’s really counts; and the colossal conformity-producing, passivity-inducing, criticism-sidelining machinery they have constructed makes withdrawal extremely tempting and almost excusable. But as Chomsky usually replies when asked by listeners for some ground of hope: to do nothing makes the worst more likely.

All we have is a voice—and not, most of us, as penetrating a voice as Chomsky’s. But if there were ever a time to lift it in defense of our lovely, perishing planet and our sometimes lovely, endangered, self-destructive species, this is it.


[*] Correction: An earlier version of this essay incorrectly attributed the quote about the freedom of the press to Mencken.

Excerpted from Only a Voice by George Scialabba, published by Verso. © George Scialabba 2023.