Rage Against the Machines

The real danger of videogames isn't violence; it's swindling
Illustration by Michael Duffy.

After Adam Lanza gunned down twenty children, six staff members, and himself at Sandy Hook Elementary School in late 2012, authorities began the kind of forensic investigation reserved for airplane crashes and sites of murderous terrorism. The details of Lanza’s life became catalogues of potential deviancy. He had made his bed that December morning. His armoire held five matching tan shirts and five pairs of khaki pants. An empty cereal bowl flanked damaged computer parts on his desk. And as any veteran of America’s periodic sagas of horror and grief wrought by young white men would expect, the investigators announced they had found the black box, the clue to the riddle, salvaged from the abyss: “thousands of dollars worth of graphically violent videogames,” according to one media report, inside the Newtown home Lanza shared with his mother, whom he also killed.

The announcement played perfectly into the hands of the consensus view. After all, National Rifle Association CEO Wayne LaPierre had delivered a statement following the Newtown massacre in a desperate attempt to stiff-arm gun-control efforts. In it, he called out “vicious, violent videogames with names like Bulletstorm, Grand Theft Auto, Mortal Kombat, and Splatterhouse” as evidence of a “callous, corrupt, and corrupting shadow industry,” which was the real cause of violent slaughters like Lanza’s. Television news shows fell into line, running segments about local Newtown children voluntarily forsaking videogames. Vice President Joseph Biden established a gun violence task force, inviting media executives from film and game companies to White House briefings to answer for themselves.

And now shades of Wayne LaPierre’s diatribe fell across leadership-class opinion like a closing curtain, the audience murmuring over the last act of the indescribable mystery. Videogames made him do it. Newtown’s aftermath offered another example of the bipartisan view that videogames are stimulants to the most pernicious real-world depravities imaginable, their fantasy violence cutting a hole in America’s soul.

The Columbine massacre, you may recall, was a watershed moment in this particular blame game. The murderers Eric Harris and Dylan Klebold were known to play Doom, the first-person shooter that effectively inaugurated that genre and that was later licensed to the U.S. military for training purposes. Adam Lanza hadn’t forgotten. Even if Lanza didn’t carry out the Sandy Hook murders under the influence of videogames, the investigation said he had “an obsession” with Columbine, a connection that lets the specter of videogames in through the back door of the demonology.

Doom was the plaything of Harris and Klebold, but it wasn’t the first game to attract unwanted publicity. Mortal Kombat ignited controversy in 1993, six years before Columbine, over its absurdly gory depictions of hand-to-hand combat and its lethal finishing moves, called “fatalities.” And the original moral panic over violence in videogames came decades earlier, via the 1976 coin-operated game Death Race, inspired by Paul Bartel’s cult film Death Race 2000. There, two players steered vehicles around a course, attempting to run down fleeing “gremlins.” The graphics were extremely rudimentary, but in the mid-1970s, the idea of a game in which players ran cars over stick figures was enough to provoke a frenzy. How far we haven’t come.

Anybody who grew up in America can tell you it’s a pretty violent country, and every consumer knows that our mass culture was reflecting that fact long before it began spewing the stuff in videogames. So on the surface, it seems strange that special powers should be attributed to games. What gives?

One point to keep in mind is that moral outrage over videogames’ violence was possible only once they could make reasonable claims to realism—once games, like movies and television shows, were understood in terms of their content. In 1994, in the wake of the Mortal Kombat controversy, the leading videogame industry trade association established the Entertainment Software Rating Board (ESRB) as a self-regulatory rating body. The ESRB was charged with adopting, assigning, and enforcing age and content ratings for videogames, which, like other products of mass culture, became potential “murder simulators” that had to be regulated by a board of constituted authorities, for all the crazy reasons Wayne LaPierre enumerated.

But if there is something dangerous about videogames now, it’s not the specter of players transforming into drooling sociopaths by enacting depraved fantasies. Instead of forensically dissecting the content packaged in games, we should look closely at the system of design and distribution that’s led them out of teen bedrooms and into the hands of a broader audience via computers and smartphones. It’s not Doom or Mortal Kombat or Death Race we should fear, in other words; it’s Candy Crush Saga, Angry Birds, and FarmVille.

To understand what is really distinctive about videogames, it helps to see how their operation runs like a racket: how the experience is designed to offer players a potentially toxic brew of guilty pleasure spiced with a kind of extortion and how they profit by stoking addiction. We might remember why we looked sideways at machine-enabled gaming in the first place—because it was a mode of play that seemed to normalize corrupt business practices in the guise of entertainment. Because the industry often seems like just another medium for swindlers.

An Offer You Can Reuse

Coin-op videogaming first emerged in the 1970s in the same venues that had previously hosted other coin-based machines like pinball: bars, laundromats, bodegas, and other humble but slightly seedy corners of everyday life. In the early 1980s, videogames spread from the tavern to the arcade, where they became a family affair, even if arcades were still seen as disreputable venues for kids and teens. Pong and Death Race gave way to Space Invaders, Pac-Man, Defender, and Donkey Kong. In their heyday, coin-op videogames were parked in mall arcades, and, like most cash-based businesses, the cabinets came under suspicion of money laundering.

No matter its theme, every coin-op game operates according to the same basic commercial logic: you pay to play with the understanding that the game will do its best to eject you as quickly as possible. You play against both the game and the machine. Your playing, in fact, creates part of the game’s very structure: a challenge that you can understand, accept, and sometimes overcome through a combination of good fortune and expertise. Your reward is time. And just as slot machines have odds tables, coin-op videogames are programmed to distribute the reward of time in managed increments—around three minutes for an average player. You play the game, and the machine plays you. Manufacturers tuned the design of coin-op games to yield maximum “coin drop.” An arcade machine doesn’t pay out slot-machine winnings, but it does dole out its own form of gratification and “payout” as you eclipse your score and dive further and further into the game’s structure before it boots you out again. Marginally improving your performance requires another fistful of coins.

In their 1983 book Mind at Play, psychologists Geoffrey R. Loftus and Elizabeth F. Loftus pointed out that the era’s games relied on partial reinforcement, a type of operant conditioning that provides a reward intermittently. Partial reinforcement, you may recall, was the logic behind B. F. Skinner’s infamous behaviorist rat experiments. It’s also the rationale employed by casino slots. Indeed, according to the Loftuses, the earliest video arcades were designed to operate on the same principle as the slots—scheduling payments for a short-term play experience. While the content of a game might offer an initial lure to different kinds of players—women and girls, for example, were presumed to be fonder of Pac-Man than of the space combat game Galaxian or the sci-fi shooter Robotron: 2084—the real draw of videogames could be found in time-managed capitalism.
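The Loftuses’ point can be made concrete with a toy simulation. The following sketch models a variable-ratio reinforcement schedule—the kind casino slots use—in which each play has a fixed, independent chance of a reward, so payouts arrive at unpredictable intervals. All the numbers here are illustrative assumptions, not figures from Mind at Play:

```python
import random

def variable_ratio_session(plays, reward_prob=0.25, seed=42):
    """Simulate a variable-ratio reinforcement schedule.

    Each play is rewarded independently with probability
    `reward_prob`, so the gaps between rewards vary -- the
    intermittent payout pattern the Loftuses describe.
    Returns the list of gaps (plays elapsed between rewards).
    """
    rng = random.Random(seed)  # fixed seed for reproducibility
    gaps, since_last = [], 0
    for _ in range(plays):
        since_last += 1
        if rng.random() < reward_prob:
            gaps.append(since_last)
            since_last = 0
    return gaps

gaps = variable_ratio_session(1000)
# The average gap hovers near 1/reward_prob, but any individual
# gap is unpredictable -- which is precisely what keeps the
# player feeding in coins.
print(sum(gaps) / len(gaps), min(gaps), max(gaps))
```

The unpredictability is the product: a fixed-ratio schedule (a prize every fourth play, say) extinguishes the behavior as soon as payments stop, while the variable schedule keeps the player at the machine long after.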

But this was always a minority view in the culture wars of the eighties, especially as videogames left the arcade. By the mid-eighties, games had become media consumables like the cartridges and discs on offer from Nintendo, designed to be played in front of the home television. Coin-op games persisted (Mortal Kombat was first released as an arcade cabinet), but by the early nineties, gaming’s stint as a weird, multibillion-dollar family casino was a distant memory. From the industry’s perspective, the sale of consoles, cartridges, and discs for a fixed price offered a far more lucrative, predictable, and growth-oriented marketplace. Why erect a nationwide network of Huxleyan drug sensoriums, after all, when users can tie off in the privacy of their homes?

From Coin-Op to Free-to-Play to the End of the Line

The new model of videogame delivery is “free-to-play” (F2P). At first it was limited to massively multiplayer online games (MMOs) like Neopets and MapleStory, which primarily relied on kids pestering their parents to fund their accounts so that they could buy in-game goods. These games always offer the first taste for free, and then ratchet up the attraction of paying for a more robust or customized gaming environment. In 2007, Facebook released a platform for developers to make free-to-play apps and games run within the social network’s ecosystem. Then came the iPhone, the Apple App Store, and all the copycats and spinoffs they inspired. By 2010, free-to-play had become the norm for new games, particularly those being released for play online, via downloads, on social networks, or on smartphones—a category that is now quickly overtaking disc-based games. The point is to sell, sell, sell; the games give users opportunities to purchase virtual items or add-ons like clothing, hairstyles, or pets for their in-game characters.

In 2009, Facebook gaming startup darling Zynga launched a free-to-play game called FarmVille that went on to reach more than 80 million players. It offered a core experience for free, with add-ons and features available to those with enough “farm cash” scrip. Players could purchase farm cash through real-money transactions, earn it through gameplay accomplishments, or receive it as a reward for watching video ads or signing up for unrelated services that paid referral fees to game operators. Former Zynga CEO Mark Pincus sought out every possible method for increasing revenues. “I knew I needed revenues, right fucking now,” Pincus told attendees of a Berkeley startup mixer in 2009. “I did every horrible thing in the book just to get revenues right away.”

Every horrible thing in the book included designing a highly manipulative gameplay environment, much like those engineered into slot machines and coin-ops. FarmVille users had to either stop after they expended their in-game “energy” or pay up, in which case they could immediately continue. The in-game activities were designed so that they took much longer than any single play session could reasonably last, requiring players to return at prescheduled intervals to complete those tasks or else risk losing work they’d previously done—and possibly spent cash money to pursue. Players were prodded to spread notices and demands among their Facebook friends in exchange for items or favors that were otherwise inaccessible. As with slots and coin-ops, the occasional calculated anomaly in a free-to-play game doesn’t alter the overall results of the system, but only recharges the desire for another surprise, another epiphany; meanwhile, the expert player and the jackpot winner are exceptions that prove the rule.

FarmVille’s mimicry of the economically obsolete production unit of the family farm, in short, proved all too apt—like the hordes of small farmers sucked into tenantry and debt peonage during the first wave of industrialization in America, the freeholders on FarmVille’s vast virtual acreage soon learned that the game’s largely concealed infrastructure was where all the real fee-gouging action was occurring. Even those who kept their wallets tucked away in their pockets and purses would pay in other ways—by spreading “viral” invitations to recruit new farmers, for example. FarmVille users might have been having fun in the moment, but before long, they would look up to discover they owed their souls to the company store.

Zynga made hundreds of millions of dollars consuming smaller developers and building a gaming empire that boiled the blood of incumbents still wedded to the hits-and-commodities model. Big game titles in the console era, such as Call of Duty and Grand Theft Auto, took years for designers to develop and for sales teams to market—all in service of the old industry model of shipping discs in boxes and hoping for impressive first-week sales, in the same way that the film industry counts on huge opening-week box office returns for big-tent film releases. On top of that, the legacy gaming industry still had to fend off all the old culture-war complaints about violence and delinquency—accusations that gained no traction against a wholesome-looking, cartoonish farming game. Meanwhile, overnight successes like Zynga managed to enjoy the media-darling status of the technology startup world. FarmVille had cows and tractors. Your mom probably played it. It was wholesome.

Paying a dollar for a virtual avocado tree or a reprieve to retry a level may not seem troubling. But just as the original coin-op cabinets structured their challenges to fit a new kind of gameplay, one designed to end as quickly as it began, free-to-play games are altering the experience of games. In the worst cases, like the card battle game Rage of Bahamut, gaming becomes a “pay to win” affair, in which the players who pay the most perform the best. Developers have realized that such tactics burn out players fast, though. The fashionable games of today more often offer gentler, soft-sell proddings. Yet these soft sells are even more insidious, despite the surface impression that they are making the competitive environment somehow more forgiving.

Take, for example, the immensely popular Candy Crush Saga, a puzzle game developed by apps giant King. Its core gameplay is derived from PopCap’s Bejeweled, a match-three game that first found a mass audience online in the early 2000s. In Candy Crush, players match candies instead of gems, and each level sets specific objectives—eliminating a certain number of a particular type of candy, reaching a score threshold, and so forth.

The early levels are a cinch, but King has carefully designed each subsequent level to become increasingly demanding. When you fail, you lose a life, arcade-style, and losing all your lives ends the game—the emotional equivalent of seeing your last dollar disappear down the gullet of a Vegas slot machine. To continue after death, you can wait a half-hour for a new life to regenerate, pester a Facebook friend to play the game (in which case you receive a new life), or buy one as an impulse purchase. Meanwhile, you can always purchase special upgrades that assist in the completion of a level. The results are remarkable, from a business perspective. Reports suggest that King makes between $500,000 and $850,000 per day from Candy Crush.*
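The loop players are caught in can be sketched in a few lines. This is a hypothetical model, not King’s actual code; the half-hour regeneration, the cap on stored lives, and the three exits (wait, recruit a friend, or pay) come from the description above, and the five-life cap is an illustrative assumption:

```python
REGEN_SECONDS = 30 * 60   # one new life every half hour
MAX_LIVES = 5             # illustrative cap, not King's figure

class LivesGate:
    """Toy model of a free-to-play 'lives' gate.

    Time is passed in explicitly (seconds since some epoch), so the
    model is deterministic and easy to test.
    """

    def __init__(self, now=0.0):
        self.lives = MAX_LIVES
        self.last_tick = now

    def _regen(self, now):
        # Credit one life per full half-hour elapsed, up to the cap.
        earned = int((now - self.last_tick) // REGEN_SECONDS)
        if earned:
            self.lives = min(MAX_LIVES, self.lives + earned)
            self.last_tick += earned * REGEN_SECONDS

    def lose_life(self, now):
        self._regen(now)
        if self.lives == 0:
            # The monetization moment: wait, recruit, or pay.
            raise RuntimeError("Out of lives: wait, pester a friend, or pay.")
        self.lives -= 1

    def buy_or_beg(self):
        # Impulse purchase or a pestered friend: play resumes instantly.
        self.lives += 1
```

The design choice worth noticing is that the gate binds failure to real-world clock time, so the only way to convert waiting into playing is social recruitment or cash.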

Some free-to-play advocates reason that getting to try a game for free and then later choosing to pay a few dollars—or a few hundred—is basically the same thing as making an outright purchase of a media product or series. And free-to-play publishers insist that most players pay little to nothing; a King representative told the Guardian that “70 percent of the people on the last level haven’t paid anything.”

Of course, as with the casino gambling model, most of Candy Crush’s revenues come from a minority of habitual players.

The free-to-play structure isn’t just a business model that somehow got hurriedly tacked onto a game that might have been commercialized in any number of other ways. Rather, it’s a sophisticated new gloss on the classic playing-for-time model pioneered by the coin-op games of the seventies and eighties—only instead of coaxing pocket change from users, it extracts a kind of surplus value that, in the new digital economy, is infinitely more valuable: it embeds within the actual gaming experience the relentless quest for attention, word-of-mouth, and (ultimately) remuneration that drives virtually every other overcapitalized form of online activity.

United States of Swindles

When Zynga went public in late 2011, it failed to exhibit the rocket-ship liftoff that Wall Street had come to expect from hot tech company IPOs. Zynga shares rose from their $10 initial offering price to a high of $14.69 in March 2012 before falling hard. During 2013, the stock languished between $2 and $5 per share. The company shuttered studios and laid off workers, but for at least one central group of players with (financial) skin in the game, market performance didn’t much matter. Thanks to the top-heavy equity structure of the venture-capital model of capital formation, Zynga’s early investors, directors, and executives had taken advantage of secondary-market sales and new venture investment to cash out part of their positions long before the company had to disclose its financials to the SEC and the public. Even after the IPO, Zynga insiders sold off hundreds of millions of dollars in a secondary offering unavailable to the company’s employees, many of whom had been granted options and stock grants as incentives as part of standard Silicon Valley operating procedure. Required SEC disclosures reveal that then-CEO Pincus cleared $200 million through this secondary offering alone.

But the truly amazing outcome of the Zynga case study is that it hasn’t changed anything about how players and investors approach the contemporary gaming market. Despite Zynga’s fall from grace, the dream of free-to-play still tempts game creators and players alike. Given the structure and history of the gaming world, it’s not hard to see why. Like most gamblers, players believe they are exceptions who will resist being duped into spending money on in-game items or energy, or else they’ll be especially market-savvy entrants who will rationalize small payments as a reasonable concession after being backed into a corner. Rank-and-file game developers, unprotected by organized labor and wary of ever-impending layoffs in an industry as fickle as it is fashionable, have resigned themselves to free-to-play as the new normal: the will of the market. As for the executives, they have embraced the trend wholesale—and why wouldn’t they, with the beguiling specter of a $200 million public-offering payday before them?

There’s also a far broader—and, as is ever the case in the gaming world, insidious—reason for the enduring appeal of the free-to-play model. As any casual student of the 2008 market meltdown and its aftermath well knows, swindling is now a common byword in American business and culture. Games publishers have come to believe that they deserve the more predictable, generous revenues that free-to-play games offer; such paydays will finally rescue them from the terminally unstable professional niches they’ve carved out in the hits-based entertainment industry.

As I write this, King, the developer of Candy Crush, is reportedly planning a $5 billion IPO, making assurances along the way that it won’t fall into the same chasm Zynga did after going public (mostly by virtue, it seems, of simply not being Zynga). Meanwhile, King is taking advantage of new, confidential IPO filing rules that let it hide business data it would have previously had to disclose—the same sorts of off-the-books dealings that allow tech insiders to operate surreptitiously before regulators notice. Like Wall Street, Silicon Valley is already a kind of mafia.

And in this sense, free-to-play games are a kind of classic racket. They create a surge of interest by virtue of their easy access, followed by a tidal wave of improbable revenue that the games coerce out of players on terms that weren’t disclosed at the outset. The game knows more than you could ever hope to about the stakes it presents, and it uses the logic of its own immersive environment to continue generating reasons for you to pursue its skewed stakes. The creators use your attention to build collective value that they cash in before anyone can see inside the machine that produced it. Like free digital services more broadly, the real purpose of the videogame business—and, indeed, of American business writ large—is not to provide search or social or entertainment features, but to aggregate value as quickly as possible so as to convert it into wealth. Bingo!

Despite all these distressing trends of upwardly distributed wealth tortured out of the market for human attention, perhaps there’s still a kind of perverse virtue embedded deep within the free-to-play trend. Games are powerful and important partly because they help us test out the limits of ordinary life. That’s why we play. And these free-to-play games allow us to feel the edges of the unholy reality of our current winner-take-all neo-Gilded Age. Indeed, the gaming economy and the financial sector have perhaps merged to the point that we need these free-to-play games, to help us see and understand the social and economic structures of the early twenty-first century. But, then again, if we do need them, it’s only because the technology industry has thrust such a profane era upon us—a form of unlicensed gambling with the house’s money that can disclose its actual character only through the artifices of play.

*Correction: This sentence previously included a comparison of King’s daily sales figures to sales figures from the first week of Grand Theft Auto’s release, but the comparison was mathematically incorrect, and so the GTA reference has now been omitted. We apologize for the error.