The stars are not in alignment. A wave of disgruntlement has been sweeping across our country’s paragons of success. Not those masters of the universe at the absolute top of the hierarchy, to be clear—most of whom have invested a tremendous amount of capital and ingenuity in keeping themselves out of the public eye. On the whole, they are content to leave well enough alone, sated with their sky-high stock market returns and the whack-a-mole decapitation of any serious challenges to the political-economic order.
But just one rung down the ladder, a sense of victimization and resentment has begun to grip the American elite. Prompted by the pandemic-era elimination of a Seamless meal-expensing benefit, junior analysts at Goldman Sachs are, according to New York magazine, in full “revolt.” Wealthy award-winning journalists are fleeing their publications for the Wild West of Substack, convinced that editing is a form of censorship. Most recently, thirty-eight big-name Harvard faculty signed an open letter decrying the unfair treatment of star anthropologist John Comaroff at the hands of the university administrators who issued a slap-on-the-wrist sanction against him for sexual harassment in January. (Thirty-five of them eventually issued a non-apology “retraction.”) The Ivy League named-chair professoriate, not usually known for stand-taking, finally found its conscience roused by the prospect of bureaucratic circumscription of its “pedagogical” prerogatives.
Outside the Acela Corridor, the liege lords of middle-American suburbia have also embraced a put-upon affect. This was the demographic from which the Capitol rioters of January 6, 2021, were disproportionately drawn. A team of University of Chicago researchers found that the mob determined to “take America back” was stuffed with “CEOs, shop owners, doctors, lawyers, IT specialists, and accountants.” They were “middle-class and middle-aged.” Forty percent of those arrested and charged worked in white-collar jobs or owned a business; they were denizens of the sort of respectable zip codes in which American dreamers have traditionally aspired to own property. Now, the fearsome specter of Democrats winning elections has apparently placed quiet bourgeois contentment out of reach for them.
These are the little bosses, the super-employees—from Harvard Yard to Waukesha County. They have their Priuses and F-250s, their healthy portfolios, their enviable school districts. They were the winners of the New Economy of the turn of the twenty-first century. The reappearance of Gilded-Age levels of income and wealth inequality was supposed to be the price we paid to keep them happy: the stars and entrepreneurs on whom innovation and prosperity allegedly depended. And for a while they were happy. But now the persistence of the bigger bosses has begun to grate, especially since it allows subordinates to go over the little bosses’ heads with their complaints. As Jedi Master Qui-Gon Jinn observes in Star Wars Episode I: The Phantom Menace, there’s always a bigger fish.
Our workforce-wide star system—the valorization of the super-employees at the expense of everyone else—is the legacy of the consultants, management gurus, and corporate “restructurers” who sought to restore the American economy to “competitiveness” amid the doldrums of the 1970s and early 1980s.
In the conventional wisdom that crystallized among the Reagan-era business cognoscenti, the American economy had gone into a tailspin because in the decades after World War II, businesses in Europe and especially in East Asia had figured out how to replicate the American mass production system—but for cheaper, because their labor costs were lower. If an American firm had its plants churning out standardized products, it was only a matter of time before it was undercut on the market by a company making basically the same product in a factory in Taiwan for half the price.
The implication was that American business could only compete by carving out a niche for itself and its products and by staying one step ahead of foreign imitators through perpetual “innovation.” Once they’d figured out how to make your trademark product in Hong Kong, you were already rolling out something new. The apple of the consultants’ eye, extolled in countless business bestsellers, was the Minnesota conglomerate 3M, which bucked the macroeconomic trends by issuing a steady stream of new and (at least initially) unusual products, from Post-it Notes to the N95 respirator mask. The concept of “temporary monopolies,” first broached by Joseph Schumpeter and other heterodox economists of the early twentieth century, came back into vogue to describe businesses like these. The Harvard Business School “strategy” expert Michael Porter argued that the corporation of the future would only be able to compete if its businesses were adequately “differentiated” from the rest of the field—which is to say, able to attain temporary monopoly status.
In the eyes of the prophets of the innovate-and-monopolize strategy, only a handful of a company’s employees were truly valuable: the elite few who were creative, knowledgeable, and passionate enough to lead. Value at 3M came from the corporate entrepreneurs (“intrapreneurs,” as one of the ghastliest business neologisms had it) who created mini-businesses like Post-it Notes. It was not the workers who actually made the stuff, whose efforts could eventually be replicated for cheaper in factories in Asia if the innovation engine ever stalled. And it was certainly not the janitors, cafeteria employees, and clerical workers who were responsible for the daily work of maintenance at company headquarters. It wasn’t even the “middle managers,” those grey-flannel-suited bureaucratic intermediaries who, in the eyes of their contemporary critics, existed just to pass messages up and especially down the organization chart.
The diagnosis of the management intelligentsia, then, was that American businesses were underperforming in large part because the vast majority of the employees on their payroll were not actually contributing to their competitive success—“creating value,” in the era’s lingo. The doctors prescribed a corporate liposuction. “Trim the fat” became an incessant refrain in the business press. The best companies were “lean and mean.” Most middle-management jobs were simply useless and could be eliminated. Maintenance work at headquarters had to be done, annoyingly, but the services could be provided by specialist subcontractors: “Sell the Mailroom,” the writer and consultant Peter Drucker urged Wall Street Journal readers in 1989.
Even the firm’s bread-and-butter work—manufacturing, transportation, fast-food service, etc.—could be outsourced, because such “menial,” fungible operations weren’t what really counted for corporate success. Business school professors argued that companies’ “core competencies” instead consisted of the knowledge, product concepts, intellectual property, brand materials, and so on that were housed in the brains of those hallowed few executives whose jobs actually created value. The former McKinsey analyst and bestselling management guru Tom Peters summed up the new strategic wisdom: “Subcontract everything but your soul!” Enabled by a new regulatory laxity, access to torrents of liquidity from institutional investors like pension funds, and a growing shareholder impatience with status-quo corporate management, private equity raiders and their allies in the business school world dramatically remodeled the American firm in accordance with these principles. An economy increasingly bifurcated into winners and losers was the result.
If you were lucky enough to stake out a place in the soulful elite, you’d have it made. Executive compensation skyrocketed. Ordinary salaries, the Harvard Business Review complained in 1990, incentivized executives to “act like bureaucrats rather than the value-maximizing entrepreneurs companies need to enhance their standing in world markets.” Instead, the new wisdom held, executives should derive a much larger chunk of their compensation from stock options: they could make a bonanza if and only if they made the kind of tough strategic decisions that created shareholder value. A new breed of celebrity CEOs at legacy firms—Lee Iacocca at Chrysler, Jack Welch at GE—stepped onto the scene, winning handsome remuneration as well as plaudits from management experts for “trimming the fat” at their companies. (Iacocca laid off fully half of Chrysler’s workforce when he first took over; at Welch’s GE, the toll was closer to a quarter of all employees in his first four years.)
The stars who made up the firm’s lean muscle could expect not only lucrative salaries but a greater degree of workplace freedom, compared with many of the corporate managers of yore. If you regarded yourself as an entrepreneur rather than a bureaucrat, and if you earned the right to be regarded that way by your superiors, you would be rewarded with “exceptional personal autonomy,” Tom Peters promised. Highly trained “knowledge workers,” the management theorist Henry Mintzberg explained, “require little direct supervision from managers” and responded primarily to inspiration. This culture of “self-management” presumed a fanatical work ethic and appetite for grueling hours (it was fun, creative work, after all, so surely you could do it for eighty hours a week!). But compared to the roboticized systems of surveillance and discipline to which those unlucky low-status workers expelled from the company core were increasingly subjected, it had a lot to recommend it—for a while.
As private equity spread its tentacles into new domains outside the for-profit business world, executives in nonprofit and governmental organizations came under pressure to restructure their own domains and run them more “like a business.” That meant replicating the emerging divide between the charmed circle of super-employees, who enjoyed good pay and pleasant working conditions, and the unexceptional masses, who were lucky to find any work at all.
In some ways, the twenty-first-century academic labor market is the neatest illustration of the new world envisioned by the consultants and corporate raiders of the late twentieth century. A small group of tenured professors—“the last good job in America,” the late labor scholar Stanley Aronowitz once quipped—are rewarded generously for lending their celebrity to their universities’ competitive quest for top students and grant money. Employing a star professor like John Comaroff helps a school like Harvard enjoy a “temporary monopoly” position in an economy of academic prestige and power. Meanwhile the vast majority of the actual teaching at most universities is performed by non-tenure-track faculty—some 73 percent of all faculty positions, according to the American Association of University Professors. Many adjuncts, like victims of outsourcing throughout the economy, are condemned to barely scrape together a living on short-term contracts and feast-or-famine workloads: only ever too much or not enough. Classroom instruction, apparently, is no longer the contemporary university’s core competence.
Mainstream observers of the economic changes taking place in the late twentieth century eagerly anticipated the end of traditional class struggle. The working class was being converted into a surplus army of casual laborers too worn out from stringing together temporary jobs to be able to organize—breaking the last ounce of strength in the old-school unions in the process. And those pesky “middle strata” for whose allegiance workers and capitalists perpetually vied would be eliminated. Middle-class professionals who could make the leap into entrepreneurial stardom would be vaulted into the elite; those who couldn’t shape up would face proletarianization and casualization. “The rich are getting richer, and the poor poorer,” the business consultant and academic Ian Angell wrote in 1996. “The future is inequality.” He was not upset about this prospect.
In retrospect, the first tremors of dissatisfaction within the winners’ circle could be felt during the Tea Party movement of the first years of the Obama administration. Like last January’s Capitol rioters, the Tea Party was often mistaken by a credulous media for a grassroots expression of populist anger, despite evidence of its disproportionately wealthy and highly educated composition. In fact, the Tea Party was the blueprint for a strange new phenomenon: the revolt of the successful class; the rapid proliferation of an ideology of victimhood among the rich and powerful. From the Tea Party to Trump, the white supremacist ideology through which elite resentments have so often been articulated in American history has made a comfortable home for this impulse on the right and in the Republican Party. But these days, liberal stars in media and academia are clearly capable of tapping into a similar sense of beleaguerment.
It’s not hard to see why. Boosters of the Clinton-era New Economy earnestly believed their superstar knowledge workers would fuel a pattern of perpetual economic growth that would allow all of those who became rich during the late-twentieth-century era of dislocation to keep getting richer forever. Most of them didn’t expect the surplus capital sloshing around to inflate a housing bubble that would trigger a global financial crisis that would deepen inequality within the ranks of the wealthy—between the fabled 1 percent and the .1 percent—even as it widened the fissure between the wealthy and everyone else. And fewer still expected a sudden shift in cultural norms that would force the powerful, to their great irritation, to start watching what they said to and about women and people of color. And I think it’s safe to say that no one expected a pandemic that would penetrate even the most exclusive suburban enclaves.
Above all, the magnitude of the gap between the fortunes of the winners and the losers is so immense that it is hard for the successful to feel secure in their achievements, since slipping out of the top tier—however remote the chances—would be totally catastrophic, not merely inconvenient. This fear is often sublimated into a resentment that aims simultaneously downwards and upwards, as well as a paranoid conviction that the masses are allying with the big bosses to screw over the respectable middle class. Students and administrators are conspiring to regulate the speech of tenured faculty; unions and municipal policymakers are conspiring to screw over professional parents who want their kids learning in person at any cost and small business owners who bristle at mask and vaccine mandates; fraudulent “urban” voters and Democratic Party bigwigs are conspiring to rig elections and override the will of Real America.
It is crucial, then, not to mistake the super-employees’ frustration for the sort of inchoate class consciousness that leftist wishful thinking has occasionally sought to read into it. The January 6 crowd was not primarily composed of working-class rebels who were simply expressing “legitimate concerns” inappropriately. Academics who stand with their predatory colleagues against whatever limited administrative oversight they might encounter are not comparable to journalists asking the State Department tough questions, as the Harvard Marxist Stephen Marglin recently suggested. When stars explode, the supernova radiates outward in all directions—not only toward those who are further up the hierarchy.
There are, however, movements emerging that seek not just to defend the threatened privileges of the little bosses but to abolish the segmentation of the working populace into winners and losers altogether. There are new unions fighting against the casualization of teaching labor throughout higher education; there are rank-and-file reform campaigns that are seeking to revitalize the old unions that too often conceded to the imposition of a two-tier labor market within the big manufacturing corporations. The stars can lend their support, if they wish. But they will have to get comfortable with their social inferiors making demands on them, without mistaking it for an act of oppression. Solidarity means sometimes having to say you’re sorry.