What Does a Fact Look Like?
Recently I read Frank Dikötter’s exhaustive and haunting The Cultural Revolution: A People’s History, 1962–1976, and for several weeks afterward, doing as people do, I glumly repeated anecdotes from it, implying that the sudden turn to intolerance in Mao’s China in the years in question was paralleled by certain contemporary dogmas and prejudices. But I do not know for certain that any of the events Dikötter documents ever occurred; I do not know if the causes he ascribes to these events are the correct ones, or whether my adductions about the similarities between attitudes in Maoist China and the contemporary West are spurious; I do not know, strictly speaking, whether China even exists, because I have never been there. Even if I had, it could always be that a vast conspiracy was underway to convince me of the existence of a phantom land called “China.”
This is no argument against positivism or in favor of systematic doubt (which is never systematic, but always tendentious, as we see in the post-structuralist suggestion that all discourses are historically determined except the one that says they are historically determined). It is simply a recognition of the limitations of the human mind as epitomized in David Hume’s dictum, “Nothing is demonstrable.” I have been thinking of this for various reasons: because of the effusion of lies, partisan insinuations, and bullshit that has trammeled political discussion in the United States over the past five years; because of the refusal of many ordinary people, but also of authorities who should know better, to allow evidence to temper instinct on coronavirus policy; and privately, because a person close to me is suffering from a form of dementia. While her grasp of what I and those close to her know to be true is irrevocably compromised, her certainty as to all the things she believes to be the case is no weaker, or less central to her sense of self, than mine. Her lapses are blatant, but they are the magnification of failings common to us all.
The definition of the fact in English as a particular instance of truth is not an old one. Well into the eighteenth century, the sense of fact as factum, something made or done, predominated over the more abstract idea of the verifiable. In its modern form, fact appears first in legal documents and theological treatises, generally in the phrase matter of fact, to be contrasted with reason and authority, long considered the only reliable sources of truth for things not seen by the naked eye. A bemused Gotthold Lessing remarks of its German equivalent, Tatsache, that this “wordling,” the time before which is well within his recollection, now appears on every other page of the newspaper. The context is germane. The reign of facts and the factual is hardly conceivable without the custom of regular news consumption, which circulates unfinished versions of reality for readers to assess. In some ways, the rise of fake news is little more than the vulgarization of the custom, inaugurated with the spread of the printed word, of choosing what one thinks is interesting, worthy of attention, right or wrong.
Adjacent to the fact is the question of factuality—a property not of things as they are, but of what can be said about things as they are. An orange is not a fact. That oranges exist, and are citrus fruits, and may be procured at the supermarket, are facts. About much that characterizes our lives, factuality never comes to the fore. Looking now at the porcelain heron atop my thesaurus, I realize I have never worried over its factuality. Its existence is part of that intuitive truth Heidegger says is the basis for all abstract truth as embodied in propositions. If the heron vanishes, I will ask my wife where it has gone. If she tells me she hasn’t touched it, and hints that I am the one who has misplaced it, I will tell her, “I know it was there yesterday.” By saying I know, I am affirming that its having been there yesterday is a fact. This is a primary category of facts: the things we can attest to, having been physically present for them. To put a fine point on it, these are things we believe our senses have apprehended, which our mind has presented to us correctly, and the mental representation of which has maintained its integrity across time.
To accept evidence of one’s errors is so contrary to our psychological constitution that it generally requires an excuse.
My acquaintance with dementia has lately been consumed by the thought that her camera had been stolen. “It was sitting right there on the desk in the guest bedroom,” she said. In truth, like nearly everyone, she hasn’t used a camera in more than a decade, content to snap pictures on her smartphone. The camera she was thinking of sat on a different desk in a different house in a different city, and was stolen by burglars more than thirty years ago. No one could dissuade her, any more than my wife might dissuade me that I had not misplaced the porcelain heron. Had I asked my friend for details––I didn’t, because according to doctors, it’s pointless––she would have felt, I believe, as I would were I asked to prove that Faulkner was the author of The Sound and the Fury. I have no access to Faulkner’s archives and am not even sure I still own the book. I cannot confirm there was no ghostwriter. But the suggestion that I am wrong exasperates me. I know Faulkner wrote it for a fact, and saying so should suffice.
It is possible to be quite wrong about all this, though it is difficult to feel wrong. To accept evidence of one’s errors is so contrary to our psychological constitution that it generally requires an excuse. Our stance may soften if we see we were intentionally deceived, or that our conclusions were those any reasonable person would make in our place. Otherwise, we tend to dig in our heels.
Birth of a Notion
A friend asks me if I will get vaccinated against Covid during my upcoming trip to the United States. When I say yes, she volunteers that her girlfriend may refuse the vaccine because “she read they put rabies in it.” A few weeks before, a relative announced that more than a dozen residents of a geriatric facility in Madrid had died of Covid after taking the AstraZeneca vaccine. Even trawling the shadiest sources, I could not find much on a rabies conspiracy, other than a few papers on CORAVAX, a proposed Covid vaccine based on attenuated rabies virus—useful because multiple supply chains for rabies vaccines already exist, which could help the poor countries pharmaceutical companies are currently neglecting. As to the Madrid incident, there was nothing. The story seemed plucked from thin air.
The sins of memory alone fail to account for inventions of this kind. One presumes the alarmists in question did draw upon some source, but even this should not be taken at face value. The original does not, in effect, exist; it is always already filtered through the receivers’ perceptions. On its own, this is a commonplace observation, frivolous, like the question of whether a tree falling in an empty forest makes a sound. But what I contend is that these perceptual distortions have a narrative valence, that they reflect our disposition to interpret reality in narrative terms and to discard that which either violates or appears irrelevant to the story. Narratology––which began, with Hayden White’s influential Metahistory: The Historical Imagination in Nineteenth-Century Europe, to include the rhetorical features of factual narrative in its purview––opens avenues beyond the customary image of the free exchange of ideas among rational subjects for understanding how novel information is perceived and integrated and erroneous beliefs engendered.
Studies show that repeated falsehoods are often encoded in the brain as facts because familiarity is cognitively adjacent to factuality. Readers commonly incorporate knowledge from fictional sources, at times favoring it over previously acquired facts. Rhetorical schemes have convinced witnesses to perjure themselves, innocent people to confess to murder, children to accuse their parents of Satanic ritual abuse. It may be useful here to distinguish between fiction and falsehood. That brown cows produce chocolate milk is false, notwithstanding the seven percent of Americans who believe otherwise; that they are putting rabies in the Covid vaccine is fiction, that is, falsehood presented as narrative.
For Roman Jakobson, the basic form of narrative is a temporal sequence inhabited by dissimilar but related elements. This can be as simple as a declarative sentence and as complex as The 9/11 Commission Report. In his essay “Is Factuality the Norm? A Perspective From Cognitive Narratology,” Marco Caracciolo cites research on the psychology of belief that “suggests that all utterances are initially processed as true and, therefore, factual; only later are some of them falsified and tagged as fictional.” Intuitively, this makes sense. For most of human history, communication, the proper domain of the fact, must have served primarily to transmit true information. The very notion of falsehood enters consciousness as a corrective to deficits in an individual’s conception of truth. We are told something is false when we believe it to be true, and the statements X is false and X is not true are semantically and logically equivalent.
The earliest known literary works to use fictional markers in a style still familiar today are the Sumerian disputations, which appear in the third millennium BC. These allegorical, didactic stories, discussions between summer and winter or birds and fish, reveal a capacity to think as if something were true. Yet they are far from the kind of elaboration of untrue but inherently relevant details we associate both with novels and with conspiracy theories and fake news. The imaginative prerequisites are present in Sumer, but not the social organization that would make practiced reverie or systematic deception possible.
Credulousness is our default posture––suspension of belief is far harder than suspension of disbelief––and we should remember that before the popularization of radio and cinema, extensive experience with complex hypothetical worlds was the preserve of a literate minority. Moreover, the success of much that is putatively fiction––The Birth of a Nation and Jud Süss in the early days of filmmaking, but also Michael Bay’s 13 Hours or Christopher Nolan’s Dunkirk––depends on its veneer of truth, its being lifelike or telling things more or less the way they were. It is no coincidence that the virtues of these stories we are reluctant to accept as stories are the same ones characteristic of facts we refuse to admit aren’t facts.
The fact-checker fails to recognize that facts unmoored from narratives are abstractions, quickly misremembered or forgotten.
“Alternative facts” respond to an emotional demand for coherence that reality often fails to address, and when the two clash, reality suffers. The relative who referred to massive deaths in nursing homes proceeds in everything that regards the pandemic from the perspective that the danger is greater than everyone realizes and that people’s behavior is on the whole reckless. This has dictated her occasionally bemusing overreactions (like spraying a wheel of cheese with bleach after it fell to the floor), her refusal to believe in the efficacy of antibodies, her exaggeration by orders of magnitude of the quantity of new cases, deaths, and reinfections, and her dismissal of any news that gives cause for optimism. Likewise, my friend’s girlfriend has believed since late February of last year, when the first Covid cases appeared in Lombardy, that there’s something we’re not being told. Even as she has tried out and cast off dozens of conspiracy theories, she has come no closer to admitting the most reasonable conclusion: that a virus of as-yet unknown origin has infected many people, because that is what viruses do; that journalists are reporting what they know to the best of their ability; and that medical professionals by and large want people to be safe and healthy.
This is to be expected: what we call truth is rarely arrived at through verification; most often it is what confirms our preconceptions or what is attributable to sources deemed trustworthy. Attempts to correct biased perceptions embedded in the stories we’re told and tell ourselves can only succeed in part: the fact-checker fails to recognize that facts unmoored from narratives are abstractions, quickly misremembered or forgotten. This may be one reason that David Horowitz-style flip-flopping between extremes seems more common than the judicious moderation of beliefs. All forms of radicalism have readymade narratives for organizing facts, and switching between them is likely easier than trying to accommodate uncertainty. Narratives situate facts in stable and meaningful representations of the world; without them, the value of facts is anecdotal at most. As Tim Harford wrote in the Financial Times, “A simple untruth can beat off a complicated set of facts simply by being easier to understand and remember.”
Vanity of Small Differences
Systems theorist Niklas Luhmann coined the term “functional differentiation” to characterize the division of societies into distinct, partly self-governing fields. Like Max Weber’s related notion of rationalization, it reveals the opposing pressures that transform specialization into fragmentation: as the secondary or peripheral structures of a society grow more organized, more sophisticated, more determined in the pursuit of their own goals, they become increasingly alien to the ends of society as a whole, diminishing any overriding sense of purpose or belonging. Americans may be especially ill-suited to accept the subordination to experts that functional differentiation demands. Not even two hundred years separate us from the great westward expansion, and the ideal of the rugged individualist survives in endless attenuated forms. Handymen, survivalists, libertarians, and internet sleuths all embody a wilted ideal of the frontiersman who could manage just fine on his own, were it not for meddling eggheads and red tape.
Functional differentiation is indispensable to the stuff we like. Without it, there’s no TikTok, no UFC, no small-batch bourbon, no gastric bypasses, no Predator drones, no Sweet Baby Ray’s Garlic Parmesan Wing Sauce. And yet gratitude for the legions of experts working away in their obscure disciplines is rarely forthcoming. Tom Nichols writes that for many in contemporary America, the status of scientists and other holders of specialized knowledge is comparable to that of technicians at the Apple Store Genius Bar. Their superior knowledge is only good for providing us with a desired service, and should they fail to do so, or presume to advise us on questions we haven’t asked, we are entitled to go on Facebook and throw a temper tantrum. This was driven home to me during a stay in North Carolina at the peak of the winter surge, when four or five Covid patients were dying each day at my sister’s small hospital. The nurses and doctors would explain to their charges the importance of lying prone, of exercising their lungs with the spirometer, of walking around to avoid blood clots. Many ignored this advice with predictable consequences. “No, I just need you to give me that Covid medicine,” they said, refusing to believe such a thing didn’t exist.
The inevitable disunity caused by functional differentiation, and the pockets of ignorance cognitive biases produce, are aggravated when exploited by political actors on media platforms. I am referring less to social media, which are instruments of diffusion and cohesion, than to the TV stations, radio programs, and websites where more than 80 percent of Americans get their information. Mention of media bias in any progressive forum leads inevitably to hand-wringing about the calamitous influence of Fox News, but on a good day, its viewership comprises 1 percent of the U.S. population, and its presumptive rivals, CNN and MSNBC, are hardly redoubts of intellectual objectivity.
Partisanship in news programming does have polarizing effects, but the greater problem is in the format of the news itself. Expectations are placed upon it to supply limitless stimulating “pseudo-events,” as Daniel Boorstin foresaw in 1962:
We risk being the first people in history to have been able to make their illusions so vivid, so persuasive, so “realistic” that they can live in them. We are the most illusioned people on earth. Yet we dare not become disillusioned, because our illusions are the very house in which we live; they are our news, our heroes, our adventure, our forms of art, our very experience.
For Boorstin, a pseudo-event is planned for reproduction in the media, and it often retains only the most tenuous relationship to the facts. It must be intelligible, and in a country where even the well-educated are often drastically ill-informed, this demands a great degree of simplification. It must appear relevant, must affect or seem to affect a majority of viewers personally or else be slotted into the timeworn canon of ostensibly pertinent debates about gun rights, socialism, systemic racism, immigration, the one percent, and so on. It must be divisive, so that two people pretending to be experts can scream at each other and viewers can side with one and ridicule the other in accordance with what they already think. Lastly, it must be, if not entertaining, then important-seeming enough to claim time that might otherwise be frittered away on Instagram, or playing Fruit Ninja, or watching Tiger King, or scrolling through porn.
Social media rewards conformity and conflict and does not encourage ambivalence or hesitancy.
We probably know this situation is reprehensible––this is one of the reasons we’re always so pissed off. To quote Boorstin again, “Daring not to admit we may be our own deceivers, we anxiously seek someone to accuse of deceiving us.” Critics are right to be unnerved by the consolidation of media under the power of a small number of gargantuan conglomerates, but in the matter of consumers and their delusions, the problem is as much one of diversity as of uniformity: we choose the outlet that feels truest to us, and when we don’t like what it says, we turn elsewhere. The few inconvenient truths that persist across media platforms can be explained away with the help of our pocket computers, drawing on dodgier and dodgier sources whose very obscurity comes to suggest to us they may be the bearers of the real truth the establishment doesn’t want us to know. The result is increasing numbers of people who lead a feigned or surrogate life at their jobs or with their family and friends, while their innermost concerns are anchored in a world that does not actually exist. And what unites more and more Americans across the ideological spectrum is the demand that the conclusions they have drawn from these non-existent worlds be accorded the status of facts.
On a visit home, a distant relative who lives with her husband in a tiny college town complains of political correctness run rampant. She tries to be considerate, but she doesn’t know whether to call them black or African American. She is deeply troubled by a dilemma I am certain she has never experienced in person and doubtless never will. Another blames a surge in Covid cases on the young people he sees walking maskless, though scientists have long known that outdoor activity is not a significant source of transmission. Someone repeats to me for the fifth or sixth time her suspicion that “the Russians messed with the Electoral College,” a theory so easily disproven that not even Keith Olbermann bothered floating it. A friend’s wife, a self-described “moderate,” bemoans the difficulty of trying to respect “both sides” when “the left” won’t even do anything to secure our elections.
All of these people live in a world of their own making, which they have in some sense chosen, even as it distresses them. All of them dwell in that state of semi-distraction and muted agitation that comes from being permanently in reach of devices that spit forth a stream of mostly insignificant but ominous-seeming novelties. Spellbound by the scandals of the Trump administration and the horrors of the pandemic, they have mistaken that feeling of edginess as an imperative to be on the alert in what they are certain are moments of world-historical importance.
Nick Land’s term “hyperstition” refers to the way delusions can produce the realities of which they themselves are thought to be the product. (Land is an odious figure nowadays, but so it goes.) The Trump election illustrates perfectly the workings of hyperstition. Dependent on an array of contingent factors, each of them probably indispensable for victory (the clown car of Republican primary candidates, a singularly unliked opponent, James Comey’s still inexplicable letter of October 28, 2016), it quickly turned into a parable about globalization, the unbridgeable divide between the two Americas, racial animus, and economic malaise. The grievances attributed to Trump’s base––wealthier and more diverse than liberals have generally cared to acknowledge––have now hardened, expanded, and projected themselves backward into collective memory, firing the spirits of a wave of farcical MAGA acolytes and sparking an earnest (?) debate in the Republican Party about whether to return to the orthodoxy of small government and low taxes or go balls-deep on ethnonationalism and rooting out the pedophile cabal.
Their wavering leads to a final consideration on what a fact feels like: in narrative, a fact has a telos. Social media has done much to diffuse and amplify post-truth discourses, but its greater significance is as an attractor for those thirsty for a vision of reality in which they get to play the hero. In its more innocuous forms, this may go no further than screenshotting a tweet of someone defending the legitimacy of biological sex and throwing it up for a legion of followers to pillory; at worst, you get the Cyber Ninjas hiring private security to protect them from Antifa while they pretend to save American democracy. It is hard to say how many people radicalized through social media really hold the beliefs they flaunt, and how firmly: social media rewards conformity and conflict and does not encourage ambivalence or hesitancy. Probably much of what people claim now to believe and feel––and I am including not only the ludicrous premises of conspiracy theory but many testimonies of trauma and micro-trauma and the widespread, almost instantaneous resort to outrage that has become the standard response to disagreement––merely facilitates ersatz rites of passage, where ordinary rules are suspended and the as-ifs of myth and ritual momentarily become real. People prefer these to mundane reality because they offer a chance to star in a story about how special and consequential we are.
At present, functional differentiation has arrived at such a degree of rarefaction that even specialists in closely related fields may be at a loss to comment on one another’s work. The idea that a responsible citizen will ever master even a fraction of the knowledge in principle necessary to act as an informed and engaged participant in democracy is an illusion, above all at a time when New Monetarism and virology must compete for attention with My 600-lb Life. Then again, the facts that matter most to us are those that touch us personally, and without active attempts to foster shared culture and shared reality—neither of which is an earnest goal of American ideologues, who profit far more from divisiveness—very large and highly specialized societies make people feel lonely, alienated, and powerless. In this situation, improvised truths based on emotional appeals respond to a vital, otherwise unanswered need. The protection the First Amendment accords to this misinformation (and just plain garbage) and its proliferation on global platforms is equivalent, in epistemic terms, to that interpretation of the Second Amendment which allows any moron to amass a private arsenal in his garage. With both, the probability of reducing harm in the near term is close to nil.