How will we spend the remaining 700,000 hours of the twenty-first century? In the metered time of our own discretion, there have never been more options for our personal entertainment, nor have they ever been more freely available. We find ourselves strolling the aisles of a vast sensorium. On the shelves is a trove of experiences: video games, movies, TV shows, virtual reality, books, and comics, all prepackaged for our consumption. What had previously been accomplished for food through the distribution of supermarkets has now been done with experience itself. The recent grand opening of this supersensorium has been mediated through the screen, a panoply of icons, images, links, downloads, and videos auto-playing, which we browse through entirely at our leisure.
In June of 2018 the shares of the company most devoted to this sybaritic vision, Netflix, finally broke the $400 barrier. Since 2008, Netflix stock has performed better than Amazon and Google combined. If you had bet $1,000 in January of 2008 on streaming entertainment as a cultural force, you’d now be taking home a cool $100,000. In that time, too, Netflix has accumulated over 130 million subscribers, and with the data it collects has built a massive production mill, one optimized by continuous A/B testing of viewer preferences and behavior, including tracking technologies that register where viewers press pause and when they let episodes run late into the night. All of this in an effort to supplant what CEO Reed Hastings has openly identified as Netflix’s number one competition: sleep.
Such abundance of choice would have been heralded as miraculous in any other age. What a rousing cry for progress that our lowly living rooms would have stupefied with their luxuries of the senses even the God-like pharaohs, even the court of Versailles! Or maybe not—it all comes with a price. Who hasn’t lost days to binge-watching Netflix or to the dungeons of some video game? Here’s a scary, or maybe heart-wrenching, thing to consider: how much of our waking leisure time is devoted to the consumption of experiences from the supersensorium? In 2018, Nielsen reported that the average American spends eleven hours a day engaged with media. What’s the ratio of fiction to facts in those hours? Is it 25 percent? Is it—God help us—the majority? Does anyone believe that this number is going to decrease? The technology that undergirds the supersensorium will only improve. The algorithms will grow more personalized, the experiences will become more salient, and the platforms will get faster in their delivery of content. If this doesn’t seem a problem to you, extrapolate out ten years, when every family has VR goggles in their living room, and then consider ten years after that.
At the same time, this new supersensorium contains products almost universally acclaimed as being Art with a capital “A”: dramas like Mad Men, The Sopranos, The Wire. Less accepted in Art’s hallowed halls are video games, but every passing day heightens the chance some game, like the Kafkaesque Dark Souls, the philosophical Planescape: Torment, or the mind-bending mechanics of Braid, will sneak into Art’s pantheon. And while the newcomer to the scene, VR, is still figuring out how to tell stories (or even just construct experiences that don’t induce vomiting), it’s only a matter of time until it produces works of such quality. Yet despite this newfound artistic wellspring, we should all admit that the vast majority of what lines the shelves of the supersensorium is Entertainment—with a capital “E”—for otherwise we wouldn’t feel a gnawing guilt so great most of us avoid consciously calculating how our time is actually spent.
The supersensorium presents serious problems, especially if you happen to be someone who likes and maybe even produces fictions. Not that these problems are entirely new. In a letter to a friend, a thirty-one-year-old Tolstoy wrote:
I shall write no more fiction. It is shameful, when you come to think of it. People are weeping, dying, marrying, and I should sit down and write books telling “how she loved him”? It’s shameful!
If that was Tolstoy’s judgment of himself, what might his fiery judgment be of our now endless ways of telling “how she loved him”? Indeed, it’s the scale of the supersensorium that makes it a problem of the modern age, pushing to the fore old questions about the purpose of fictions. Why do humans desire these petite narratives that we gobble up like treats? What’s the origin of this pull toward artifice, a thing so powerful we might even call it an instinct? Is it virtue or vice? And if it can be a vice and technology is making it easier and easier to while away our lives this way, a reasonable person has to ask: Why add to the sensorium? Why take away from the real when the real is already back on its heels, and behind it, a cliff?
Dream a Little Dream
The first fictions appeared some 200 million years ago in the Mesozoic minds of mammals and birds. Small creatures, newly differentiated, they stole whatever sleep they could under the rule of the dinosaurs, and there in burrows or high in nests they fitfully hallucinated experiences that never happened. They dreamed. Dinosaurs, if they were anything like modern reptiles, were dreamless. Turns out the sandman only visits mammals and a smattering of birds. Perhaps a few non-chordates as well, like the spineless but neurally impressive cephalopods. But for most of the animal world, for the reptiles and amphibians and fish, there is nothing but reality.
To understand why we as upright apes are so drawn to facts that aren’t facts, to events that never happened, to fictions, we have to go back to our ancestors and ask: Why did dreaming start in the first place?
While receiving my PhD in neuroscience I happened to work in the same lab as some of the top sleep researchers in the world, so I’ll personally attest that no one fully knows why dreaming evolved, nor when it did. And since animals cannot verbally report like humans can, we can’t be completely certain animals dream at all. But consider your dog asleep in its dog bed, legs kicking as its closed eyes shift about rapidly, its stomach moving up and down, radiating comfort with its breathing and twitching life-force—could the most heartless follower of Descartes condemn such a thing to automatism?
A long evolutionary history of dreaming is rendered scientifically plausible by the homology of its biological nature. Sleep consists of approximately ninety-minute cycles, with further stages within those cycles; dreaming usually occurs during the stage marked outwardly by rapid eye movements (REM sleep). In each cycle, before reaching REM, the brain must descend through non-REM (NREM) sleep, where slow waves of activity traverse the cortex and leave behind periods of deep silence in their wake. If shaken from this stage, sleepers will report only a deathlike nothingness. Sensory information, most of which enters the brain through the way-station of the thalamus, is blocked during sleep by selective thalamic inhibition. As the brain descends deeper and deeper into this nothingness, an internal biological clock is secretly ticking away. On completion of its countdown the clock triggers a change in the neuromodulatory milieu of the brain via the firing of acetylcholine neurons, promoting wake-like neural dynamics. Also triggered by the clock, gamma-aminobutyric acid (GABA) neurons inhibit the voluntary muscle output pathways of the cortex. The result, for the body, is atonia. Paralysis. It’s the locking of an awake brain in a sensory deprivation tank of flesh and bone. Without this paralysis we would act out our dreams and nightmares. For those with rare sleep disorders, muscle output isn’t inhibited, and they often injure themselves as their fictions become reality.
How dreaming occurs, given the neurobiological set-up of REM, is not what’s difficult to explain about dreams. After all, hallucinations are common in real sensory deprivation tanks. Deprived of bottom-up input from the senses, dreaming seems to be the natural state of the brain; it’s natural because there isn’t much of a difference between everyday perception and dreams. To an electroencephalogram picking up brain waves, the two states aren’t readily discernible. In fact, it’s waking perception that requires an explanation. Waking consciousness is a dream, but one that happens to correspond to reality, mainly because its sources are our sensory organs. Our eyes, ears, skin, noses, all save us from solipsism merely because they have been tuned by evolution so finely that the dream of our life correlates with the state of the world. Our waking life is merely an appropriately selected (in all senses of the word) hallucination. So behold, Leibniz! Your conception of the universe as two clocks, one mental and the other physical, ticking away in divine synchrony, is true. But the clocks are set to the same time not by a deity but by the Darwinian forces of evolution.
The connection between dream and wake is so close, in fact, that the transition to wake, if allowed to occur naturally and spontaneously in the absence of alarm clocks, is almost always from REM. It is like an already-online consciousness getting off to a running start by swapping out random sources for real input from sensory organs. What a lucky dream that last one is, the one that gets to be extended across the whole day, that gets to include the quotidian, the agony and ecstasy, the small pleasures and little horrors of a normal human’s waking hours, before each dream of a day ends with our heads hitting the pillow once more.
The more difficult question: Why? What’s the point of dreaming? The mystery is so deep that some scientists still take seriously the null hypothesis of sleep: that sleep doesn’t do anything at all. Nevertheless, there’s a lot of evidence that sleep is healthy for you, though most of it is correlational. It’s probable that the housekeeping tasks of the brain are performed during sleep. Neurons are messy, squelching, erupting entities, and they generate lots of intercellular trash. In 2012, Dr. Maiken Nedergaard and her colleagues showed that cerebrospinal fluid may flush through the brain during NREM, clearing out the bio-detritus of thought. It’s as if the brain has its own wash cycle every night. Another hypothesis on the purpose of sleep has been advanced by my graduate school advisors Giulio Tononi and Chiara Cirelli. They think that during NREM the synapses of neurons are all altered, shrinking universally and to a similar degree, such that each morning the stage is set for the learning that will inevitably strengthen and grow them throughout the day.
REM and dreaming are even more elusive for science; some still believe dreaming is merely an epiphenomenon that comes at the witching hour, unearthly but irrelevant. Yet there’s evidence dreaming is a biological necessity. If you’re dream-deprived you will experience “REM rebound” over the next few days as your brain crams in as much REM as possible. A scientist clever (and perhaps cruel) enough can take this to extremes. Hook rats up to an electroencephalogram with an automated program that can differentiate between NREM (big, slow brain waves) and REM (wake-like, smaller brain waves). Put the rats on a treadmill a bit above some water. When the rats sleep, if it’s NREM, let the treadmill stay motionless. If they show the signs of REM, start the treadmill and force them to wake up to stay out of the water. Dream deprivation. A team at the University of Chicago did something like this to rats back in the eighties. After a few days, the rats began to lose weight. After a few weeks, they began to jaundice. Their fur turned yellow, then their eyes. Their yellowing paws developed lesions. And eventually, every rat dropped dead from not dreaming.
Your Inner Kafka
Given that death arrives for the non-dreamer, it would seem that some hypothesis about the purpose of dreams is obviously necessary. Historically, oneirology (the study of dreams) is most strongly associated with Freud, but there’s strong scientific consensus that few if any of Freud’s theories hold up. Instead, the current hypotheses are centered on the role sleep and dreaming might play in memory consolidation and integration. Some scientists think the brain replays the day’s events during dreams to consolidate the new memories with the existing structure. But the imagined positive role dreams play in memory consolidation or memory integration captures nothing about the phenomenology of dreams. We need a face-value theory of dreams, one that looks at their ubiquity (every night) and can explain their content, which is narrative and fantastical. Many dreams could be short stories by Kafka, Borges, Márquez, or some other fabulist. Every human, even the most unimaginative accountant, has within them such an author, a surrealist scribbling away at night. Could it be precisely the fabulist and fictional aspect of dreams that reveals their true purpose?
Let me explain. You have an experiential worldline, which is a tracking of your experience through the space of possible experiences. Imagine a vast abstract space wherein the different degrees of freedom are sensations: hot to cold, distant to close, bright to dark; and also proprioception, smells and tastes, even representations. All manner of things which you can feel, classify, perceive. Every conscious experience you have is some point in this vast statespace of possible experiences. Every frame of every movie you could see, every meal you could possibly eat, every note you could hear—all exist somewhere in the statespace. A single day forms a particular collection of points in it, a trajectory. Your whole life will unfold as an exploration of this space, moving through a cathedral so multitudinous in its dimensions it would require transfinite mathematics to even begin to define.
There is some material mirror to all this—a neural statespace that matches the experiential statespace point for point in its dimensions and trajectory. The mysterious relationship between the two abstract spaces is otherwise known as the mind-body problem. A scientific theory of consciousness, the holy grail of neuroscience, would take the form of a universal mapping function, such that for any particular neural state one could, at least in theory if not in practice, read out some experiential state, and vice versa.
If viewed from the outside, the experiential worldline of a reptile might trace over and over again the same paths, while you’d notice that a mammal’s worldline would be more regularly frenetic. Why? Dreams are like random walks of experiences—a form of exploration. To frame dreams this way makes it more obvious why they matter: dreaming allows us to explore the experiential statespace in ways that deviate from waking life. And this is a good thing, even from a strictly evolutionary perspective. The utility of dreams is that they do exactly this. The purpose of dreams is the dreams themselves.
Consider those experiences which contain sensations and scenarios that are rare, or maybe even fatal, in real life. Falling from a tall building. Being hunted. It’s an evolutionary imperative that your brain tests these experiences in order to represent them appropriately should they happen. And one reason we can accurately perceive such fringe events is precisely because they are not entirely novel—we’ve explored those horrific experiential landscapes, or ones close enough to them, in the safety of dreams.
It is common for dreams to involve nonsensical objects or borderline categories. People who are two people, places that are both your home and a spaceship. Dreams explore the statespace and in doing so warp and play with the categories, the dimensions of perception itself, stress-testing and refining. The inner fabulist shakes up the categories of the plastic brain. Their authoring avoids a phenomenon called overfitting. Overfitting, a statistical concept, occurs when a model is too sensitive to the data it’s been fed and therefore stops being generalizable. It’s learning something too well. For instance, artificial neural networks have a training data set: the data that they learn from. All training sets are finite, and often the data comes from the same source and is highly correlated in some non-obvious way. Because of this, artificial neural networks are in constant danger of becoming overfitted. When a network becomes overfitted, it will be good at dealing with the training data set but will fail at other similar data sets. All learning is basically a tradeoff between specificity and generality in this manner.
The most common way to get around the universal problem of overfitting is to expand the training set. But for real brains, the learning process that produces our experiential landscape relies on the training set of life. That set is limited in many ways, highly correlated in many ways. Life alone is not a sufficient training set for the brain. Dreams prevent our brains from overfitting our experiential statespace, blurring categories and taking unlikely trajectories. The fight against overfitting every night creates a cyclical process of annealing: during wake the brain fits to its environment via learning; then during sleep the brain “heats up” through dreams that prevent it from clinging to suboptimal solutions and models.
In this view, dreams are the exercise of consciousness. It might even be that our experiential statespaces are use it or lose it, just like muscle mass. The dimensions are always shrinking, perhaps through some neural mechanism, perhaps worn down by boring and repetitive days. The imperative of life to minimize metabolic costs almost guarantees this. The opposite of the expanding material universe, our phenomenological universes are always contracting. Dreams are like a frenetic gas that counteracts this with pressure from the inside out.
Dreaming, then, isn’t about integrating new memories or processing the day’s events; it’s rather a necessary technique for ensuring a healthy waking consciousness, one that can navigate possible experiences. And it’s the banality and self-sameness of an animal’s days that evolved the inner fabulist—here originates our need for novelty, and, for some, novels.
Outsourcing Our Dreams
From an evolutionary perspective, it’s rather amazing humans are willing to spend so much time on fictions. In Denis Dutton’s The Art Instinct, he imagines a version of humans that evolved to love only true facts, not imaginary stories:
If humans loved only true stories, there would be no philosophical “problem of fiction,” because there would be no intentionally constructed fiction in human life. . . . We could be expected to react to known-to-be-untrue stories and made-up fantasy much as we react to uselessly dull knives or, worse, the smell of rotting meat.
Why did we evolve to be so different from these hypothetical truth-loving humans? Why are we so fascinated by things that never happened?
In a sense, the fictions that the entertainment industry (on one end) and artists (on the other) are in the business of producing are consumable, portable, durable dreams. Novels, movies, TV shows—it is easy for us to suspend our disbelief because we are biologically programmed to surrender it when we sleep. I don’t think it’s a coincidence that a TV episode traditionally lasts about thirty minutes, roughly the length of the average REM event.
This hypothesized connection explains why humans find the directed dreams we call “fictions” so attractive and also reveals their purpose: they are artificial means of accomplishing the same thing naturally occurring dreams do. Just like dreams, fictions keep us from overfitting our model of the world. Since society specializes for efficiency and competency, we began to outsource the labor of the internal fabulist to an external one. Shamans, and then storytellers with their myths, and then poets, writers, directors—all external dream makers, producing superior artificial dreams. The result is that a human equipped with modern experiential technology (e.g., TV, novels) can gain the benefits of dreams even during the day.
What’s more, artificial fictions are more structured than natural ones. Joseph Campbell thought all narratives were a form of “the monomyth,” or the “hero myth.” It starts with some call to adventure or change, requires the protagonist to pass a guardian, perhaps meeting a helper or mentor before facing their challenges, requires a descent into the abyss, and finishes with transformation and atonement. This is so broad that it can describe Star Wars, Harry Potter, or even, with a bit of interpretation, Pride and Prejudice. Our desires and goals, our moral landscape, take just as much from the fictional as from the real. And that can be a good thing. There is a sense in which something like the hero myth is actually more true than reality, since it offers a generalizability impossible for any true narrative to possess.
As Yuval Noah Harari points out in Sapiens: A Brief History of Humankind, many of the things we normally consider real are themselves fictions. These fictions include not just religions but also companies, money, and nations. Such things exist only because everyone agrees they do. In fact, it is the capacity for mass cooperation under the influence of fictions like myths, like religions, that explains the rise of humans and their planetary dominance. In a TED Talk Harari says:
We can cooperate, flexibly, with countless numbers of strangers, because we alone, of all the animals on the planet, can create and believe fictions—fictional stories. And as long as everybody believes in the same fictions, everybody obeys and follows the same rules, the same norms, the same values.
Shared narratives solve coordination problems because everyone has the same framework. The evolutionary biologist David Sloan Wilson, backing up Harari, called this capacity for cooperation humanity’s “signature adaption.” Yet the binding power of stories applies as much within individuals as it does across them—they bind together our very selves. Shakespeare posed the disassembled nature of humans as:
All the world’s a stage,
And all the men and women merely players.
They have their exits and their entrances,
And one man in his time plays many parts.
These different parts must coherently act together; the temporal slices of a person’s life must be coordinated as if each slice were a different individual because, from the perspective of physics, they are. To organize the temporally disparate versions of us, we use a myth called a self. It creates a natural agreement among the different versions of us, enabling contiguous behavior and solving coordination problems. You are a protagonist in a story told by a spatiotemporally disparate set of individuals.
The better we understand narratives the better our ability to coordinate the fragments of ourselves that have been scattered across time. Artificial fictions serve as a set of examples, and they also allow us to randomly walk about different selves, exercising the experiential space that pertains to the governance and understanding of selves, in much the same manner that dreams do for perceptions, actions, and categories in general. In the end our artificial dreams are similar enough to natural ones, but the emphasis on selfhood and personal journeys indicates their constructed nature, their purposiveness. They avoid overfitting while also instructing, however subtly. The world is like this. A person is like this. A family is like this. Over and over again until we slowly get a model of the world, and a model of ourselves, generalized enough to match the fluidity of the world.
All of which might explain this weird obsession of ours, our sensitivity, even hunger, for stories. And why we’re so drawn to them, especially now. There is a property called neoteny, from Greek roots meaning the retention of childlike traits into adulthood. Neotenous adult animals look, and also behave, like juveniles of their species. It’s common in domesticated animals. In fact, just selecting for certain behaviors, such as friendliness with humans, can lead to physical neoteny. In a famous experiment conducted during the Cold War, foxes were domesticated by the Soviet scientist Dmitry Belyaev. The foxes, selected just for tameability, took on the characteristic neotenous looks of puppies. Our own faces are childlike compared to other animals because we are self-domesticated in this manner.
Our current consumption of artificial dreams is really a form of culturally enforced neoteny: the development period of our very selves is extended by these technologies. Children love stories most of all, and now we, neotenous adults in the twenty-first century, love stories almost as much. A love that has been only growing for the last few centuries. Of all the predictions about the future, none tell the truth: that we will act ever more like children. This isn’t necessarily a bad thing. Maybe it’s not happenstance that the majority of human progress occurred after the invention of the novel. Precisely during the time that adult humans began to act more like children and mass-produce fictions, humanity rocketed forward. Perhaps we were, in our obsession with fictions, teaching ourselves something more powerful than any collection of facts: how to be a protagonist.
A young woman I know had to stop watching television late at night. If she didn’t, then later, lying on her back in the dark, she’d begin to hear the characters talking again. Not to her. Just carrying on their own conversations, repeating themselves, even constructing new plots and scenes. Seinfeld to Elaine, Kirk to Spock, Ross to Rachel. The shows began to run themselves.
Our reaction to the screen is fundamental, physiological, and so commonplace we don’t credit its strangeness anymore. According to Tim Wu’s The Attention Merchants, when television was first introduced: “We ate our suppers in silence, spilling food, gaping in awe,” said one woman in 1950. “We thought nothing of sitting in the darkness for hours at a stretch without exchanging a word except ‘who’s going to answer that confounded telephone?’” This new technology paralyzed us into atonia—we became an awake brain locked in a motionless body, as if we were dreaming in the daytime.
Perhaps the latest evolution of this phenomenon can be roughly dated to 2013, with the introduction of House of Cards, the thirteen episodes of which Netflix released all at once for continuous viewing, and which many audiences watched in just a day or two. By 2017, “binge-watching” was officially in the Merriam-Webster dictionary, and The Attention Merchants says that “a Netflix poll of TV streamers found that 61 percent defined their viewing style as binge watching, which meant two to six episodes at a sitting,” something I am certainly guilty of myself.
David Foster Wallace’s Infinite Jest concerns a video so entertaining that people who begin watching it literally cannot stop, soiling themselves. There’s a scene in which a crowd is captured by it, one by one as they enter the room and catch a glimpse of the enchanting screen, until “all were watching the recursive loop the medical attaché had rigged on the TP’s viewer the night before, sitting and standing there very still and attentive, looking not one bit distressed or in any way displeased, even though the room smelled very bad indeed.”
In biology this is called a superstimulus. It’s like a hack for behavioral reward. Baby gulls cry and peck at their mother’s mouth, which is striped in red. Lower a painted stick with stripes of the reddest red and they’ll climb out of the nest in excitement. Australian beetles are so attracted to the brown backs of discarded beer bottles that they bake to death in the hot desert sun mating with them.
Humans aren’t some miraculous biological exception. Science has revealed how fragile, corruptible, and paradoxical humans are, and how old these instincts are. In his collection Winter, Karl Ove Knausgaard pens a series of letters to his unborn daughter:
We are at each other’s mercy. All our feelings and wishes and desires, our whole individual psychological make-up, with all its curious nooks and corners and its hard carapace, hardened some time in early childhood, almost impossible to crack, confront the feelings and wishes and desires of others and their individual psychological make-up. Even though our bodies are simple and flexible, capable of drinking tea out of the finest and most delicate china, and our manners are good, so that we usually know what is demanded of us in various situations, our souls are like dinosaurs, huge as houses, moving slowly and cumbersomely, but if they get frightened or angry they are deadly, they will stop at nothing to harm or kill.
Most of making it through modern life is learning when to control the urges of the inner dinosaur. When to deprive it of food, sex, immediate gratification, pleasure, and yes, artificial dreams. Already there are unnoticed superstimuli among us. Porn is a superstimulus, giving access to mates most would never see. McDonald’s is a superstimulus of umami, fat, and salt. The march of technology makes it inevitable that more and more things make the jump to the biologically unrealistic.
Even a previously untouched aspect of life, social life, was dosed with a superstimulus. In Sapiens: A Brief History of Humankind, Yuval Noah Harari describes our original evolutionary circumstances:
The average person lived many months without seeing or hearing a human from outside of her own band, and she encountered throughout her life no more than a few hundred humans. The Sapiens population was thinly spread over vast territories. Before the Agricultural Revolution, the human population of the entire planet was smaller than that of today’s Cairo.
Yet in the past decade we have hooked ourselves up to not just a tribe or to neighbors or to family and friends but to a massive and endless social web in which dwell billions of humans: two billion on Facebook alone. This “social media” is frictionless, instant, enveloping. As a superstimulus it exacerbates the natural social relations of primates—there we are often reduced to our most basic instincts of aggression, othering, and gossip-mongering; all this while receiving a serotonin hit impossible to maintain in face-to-face socialization due to its scale, anonymity, and ease of access. As one anonymous former self-described online mob leader wrote in Quillette magazine: “How did I become that person? It happened because it was exhilarating.”
With the social serotonin circuitry hijacked by a massive social web, the result is that people who consider themselves utterly cosmopolitan regularly spend their day trying to rip their enemies to shreds, taunting, laughing at, calling out, doxxing, mocking. Even scientific studies have now backed up the felt truth that our social reputations are as precious to us as our bodies, and that we feel both kinds of injury in a similar way. It’s unfortunate that only real blood bleeds red; imagine if we could see the aftermath of a social media attack as clearly as we see that of a physical attack. There goes the Upper East Side mom jogging with her baby carriage, blood staining her mouth, teeth, hands. There goes the thin-shouldered, unintimidating gamer, his shirt a mess of gore. It would surprise you who the truly vicious in society are.
Science has revealed how fragile, corruptible, and paradoxical humans are.
Social media happened slowly, and then all at once. The supersensorium is being built even more slowly, one aisle at a time, but with each passing year Wallace’s prophetic description of the video it is impossible to look away from, called in Infinite Jest only “The Entertainment,” slouches toward birth.
Regular TV’s addictiveness is hypothesized to come from the orienting response: an innate knee-jerk reaction that focuses attention on new audio and visual stimuli. The formal techniques of television—the cut, the pan, the zoom—are thought to trigger this response over and over. TV, and many other cultural products, also amplify their addictiveness via their narrative or mythological properties (consider the omnipresent expression of the hero myth in everything from Disney movies to role-playing games). And as the supersensorium gets more and more super in its capabilities and extent, the biological urge to dream the monomyth grows to eat the world.
Art as Immunity
If I go long enough without fiction I begin to jaundice. I develop lesions on my hands and my eyes grow bloodshot and yellow. I stumble about like a starved vampire, violent for pages, movies, anything. I grew up in my mother’s bookstore, working there as a teenager, hawking fictions. And because of this I need fiction to have some sort of real-world relevance. I need it to be a solution to a problem, not the problem itself.
The human desire for superstimuli can never be vanquished; it can merely be redirected. At best, we upright apes develop an immunity to the worst and most addictive of the technologically enabled superstimuli, and an attraction to the edifying, neutral, or least damaging parts. Consider eating habits. Food might be the most typical superstimulus, with the result that over one-third of Americans are obese. From a certain evolutionary perspective, it’s miraculous this number is not higher. But as economics professor and blogger Robin Hanson says, “Food isn’t about Nutrition.” Rather, food is more about signaling than health—and the upper class laid claim to the Whole Foods entree, avocado toast, the smoothie, scallops on the grill, sushi. The lower class got fast food, vending machines, pizza soaking through cardboard boxes, prepackaged and processed TV meals. There is both the belief in an objective spectrum of healthiness (although it is rewritten all the time) and a corresponding association of that spectrum with class. People eat healthy for the obvious reasons, but deep down they also eat healthy because they want to present as a certain kind of person. This, despite its gross inequity, provides an immunity to the lure of caloric superstimuli. An analogous situation has been developing with the superstimuli of media, first slowly but now so quickly it is blurring past us: it begins with the biological imperative to dream, to avoid overfitting the experiential statespace; moves to the development of artificial fictions, then their distillation with the invention of the novel; then to the proliferation of the novel into movies and TV; then to the development, over the last decade, of the screen-mediated supersensorium that allows for endless consumption; all the way up to the newest addition to the supersensorium, VR, which has been known to leave users and developers with “post-VR sadness.”
Yet, just as the threat of The Entertainment looms, the idea of an aesthetic spectrum, with Art on one end and Entertainment on the other, is defunct. Despite our personal behavior often following such distinctions, explicitly promoting any difference between Entertainment and Art is considered a product of a bygone age. At worst, it’s a spectrum that’s been a tool of oppression and elitism. At best, it’s an embarrassing form of noblesse oblige. One could give a long historical answer about how exactly we got into this cultural headspace, maybe starting with postmodernism and deconstructionism, then moving on to the problematization of the canon, or the saturation of pop culture in academia to feed ever-multiplying degree programs, tracing the ideas, cataloguing the opinions of the cultural powerbrokers; or instead, one could focus on new media and technologies muscling for attention, and changing demographics and work forces and leisure time, and so many other things—but none of it matters. What matters is, now, as it stands, talking about Art (with a capital “A”) as being different from Entertainment brings charges of classism, snobbishness, elitism—of being proscriptive, boring, and stuffy.
When it’s merely a matter of intellectual debate, I too feel the attraction to breaking down such distinctions, discarding the spectrum. To the barricades! But when it becomes a conversation about how we spend our time, it doesn’t seem so academic—and the defense of Art appears the more radical and necessary position to take. Maintaining an Art/Entertainment distinction is necessary to lived experience because the supersensorium is arriving right now.
Without a belief in some sort of lowbrow-highbrow spectrum of aesthetics, there is no corresponding justification of a spectrum of media consumption habits. Imagine two alien civilizations, both at roughly our own stage of civilization, both with humanity’s innate drive to consume artificial experiences and narratives. One is a culture that scoffs at the notion of Art. The other is aesthetically sensitive and even judgmental. Which weathers the storm of the encroaching supersensorium, with its hyper-addictive superstimuli? When the eleven hours a day becomes thirteen, becomes fifteen? A belief in an aesthetic spectrum may be all that keeps a civilization from disappearing up its own brainstem.
In a world of infinite experience, it is the aesthete who is safest, not the ascetic. Abstinence will not work. The only cure for too much fiction is good fiction. Artful fictions are, by their very nature, rare and difficult to produce. In turn, their rarity justifies their existence and promotion. It’s difficult to overeat on caviar alone.
In advocating for an aesthetic spectrum, it’s important to note here that I don’t mean that Art can’t be entertaining (with a lower-case “e”), nor that it’s restricted to a certain medium. I’m quite sure there will be VR that achieves the status of Art, and that there have been both TV shows and video games that clear the hurdle. But no matter its form, Art cannot be assimilated into the supersensorium.
This is because entertainment is Lamarckian in its representation of the world—it produces copies of copies of copies, until the image blurs. The artificial dreams we crave to prevent overfitting become themselves overfitted, self-similar, too stereotyped and wooden to accomplish their purpose. Schlock. Though unable to fulfill their function, they still satisfy the underlying drive, like the empty calories of candy. On the opposite end of the spectrum, the works that we consider artful, if successful, contain a shocking realness; they return to the well of the world. Perhaps this is why, in a recent interview in The New Yorker, Knausgaard declared that “The duty of literature is to fight fiction.”
Artful narratives almost always have both a freshness and an innate ambiguity; they represent the world while avoiding overfitting via stereotype. A nudge in one direction and they veer to kitsch, a nudge in another and they become experimental and unduly alienating. They exist in an uncanny valley of familiarity—the world of Art is like a dream that some higher being, more aesthetically sensitive, more empathetic, more intelligent, is having. And by extension, we are having. Existing at such points of criticality, it is these kinds of artificial dreams that are the most advanced, efficient, and rewarding.
Entertainment, etymologically speaking, means “to maintain, to keep someone in a certain frame of mind.” Art, however, changes us. Who hasn’t felt what the French call frisson at the reading of a book, or the watching of a movie? William James called it the same “oceanic feeling” produced by religion. While the empty calories of Entertainment fill our senses, Art expands us. Which is why Art is so often accompanied by the feeling of transcendence, of the sublime. We all know the feeling—it is the warping of the foundations of our experience as we are internally rearranged by the hand of the artist, as if they have reached inside our heads, elbow deep, and, on finding that knot at the center of all brains, yanked us into some new unexplored part of our consciousness.
An explicit argument for the necessity of an aesthetic spectrum is anathema to many. It’s easy to attack as moralizing, quixotic, and elitist. But what’s essential for people to understand is that only by upholding Art can we champion the consumption of Art. Which is so desperately needed because only Art is the counterforce to Entertainment’s stranglehold on our stone-age brains. And as the latter force gets stronger, we need the former more and more. So in your own habits of consumption, hold on to Art. It will deliver you through this century.
 Proving anything for sure in the brain is difficult. Faced with some skepticism in the scientific community, one day Giulio called me into his office and said: “Erik, I must make them believe! And I must do so in the manner Christ convinced Doubting Thomas: through anatomy. Not even Christ’s appearance made Thomas believe, but he believed when Christ put Thomas’s hand inside his wound. So I must metaphorically take their hand and place it on the synapses themselves so that they feel them change every night.”
 I don’t find these leading hypotheses convincing. From what I can tell about contemporary oneirology, even in the leading memory-based theories the dreams themselves are merely phantasmagoric effluvia, a byproduct of some hazily defined neural process that “integrates” and “consolidates” memories (whatever that really means). Yet the content of dreams doesn’t reflect anything like this—you rarely, if ever, relive the events of the day in your sleep.
 The experiential statespace has a significant inbuilt component. Yet it’s also refined and expanded based on learning. Remember the first time you tasted coffee? It was like burnt dirt. Learning and categorization lead to discriminability, the fundamental ‘like this’ and not ‘like that’ function of consciousness; this means the experiential statespace is a landscape that changes as it is explored. And exploring it is our lives.
 Developing such a theory is what I worked on all through my PhD. Later I set aside the pursuit, thinking it might take centuries to answer. But whenever I speak of experiential spaces or trajectories you can substitute in some neuroscientific translation if you desire.
 This theory fits with the evidence from human sleep research: people find mathematical patterns more easily after a good night’s sleep (probably because they aren’t overfitting the problems anymore), a lack of REM sleep leads to problems in category perception (poor generalization abilities), and if you stay awake too long you will begin to hallucinate (perhaps because your perceptual processes are now overfitted).
 The risk of overfitting is greater for neural networks when what they are learning increases in complexity—perhaps then it’s unsurprising that as our world has complexified we turn evermore to fiction to “relax,” a phenomenon which might not really be relaxation at all.
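 The overfitting dynamic this note invokes can be made concrete with a toy sketch (a hypothetical illustration, not from the essay): a “model” that simply memorizes its noisy training data achieves zero training error, yet generalizes worse than a simpler rule fitted to the same data.

```python
import random

random.seed(0)  # fixed seed so the demonstration is reproducible

def make_data(n):
    """Noisy samples of a simple underlying rule: y = 2x + noise."""
    xs = [random.uniform(0, 10) for _ in range(n)]
    ys = [2 * x + random.gauss(0, 1) for x in xs]
    return xs, ys

train_x, train_y = make_data(20)
test_x, test_y = make_data(200)

def memorizer(x):
    """Overfitted model: answer with the nearest memorized training point."""
    i = min(range(len(train_x)), key=lambda j: abs(train_x[j] - x))
    return train_y[i]

def fit_line(xs, ys):
    """Simple model: ordinary least-squares line y = a*x + b, closed form."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return lambda x: a * x + b

line = fit_line(train_x, train_y)

def mse(model, xs, ys):
    """Mean squared error of a model over a dataset."""
    return sum((model(x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

# The memorizer is perfect on what it has seen — it has fit the noise exactly —
# but worse than the simple line on data it hasn't seen.
print(mse(memorizer, train_x, train_y))  # 0.0
print(mse(memorizer, test_x, test_y) > mse(line, test_x, test_y))  # True
```

The memorizer plays the role of an overfitted network; the fitted line, of a model that generalizes. The names and numbers here are arbitrary choices for illustration only.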