
I’m Baby

Digital reproduction in the metaverse

“She’s going to start to get upset because I’m not paying attention to her. I need to calm her down.” So says the baby’s parent to a third person, another adult, standing off-screen. The baby has a need, as all babies do. She is saying, in not so many words, “attend to me.” The parent excuses themselves and turns to the baby. Baby, at least in this instance, comes first.

This video—of a father talking to his baby and a friend—falls into a longstanding genre of parent-child observation. The conversation is banal on its surface, except that the baby in question is no baby at all but a model of the human infant, in two dimensions rather than three, however beautifully rendered. While reactive and interactive, it cannot retreat inside itself or self-soothe. Lacking a psyche or sense of attachment, it still trains its gaze on its parent. It can “play,” but that play is in the service of rote learning, not reality testing, not working through. The baby is a metaphor with an avatar, a cloak for a neural net that learns from inputs—cognitive, biological, and emotional.

This is Baby X, a product of the software company Soul Machines. The youngest of their Digital People, it is simulated to be almost two years old, an age when humans are understood to be most malleable. All the better, the reasoning goes, to model human cognitive function and knowledge acquisition. The Baby AI is then meant to “grow” and thus grow up. Like most of us, Baby is born to work.

If the aim of AI is to reproduce human intelligence, Baby X domesticates this dream. AI babies learn much as their biological counterparts do. In the words of physicist and historian of science Evelyn Fox Keller, “The model of social dynamics is thus parent-infant interactions, with the robot occupying the role of infant, and the trainer that of parent.” So that AI might learn from us, we’ve made it in the image of that which we teach. Or more accurately, so that we will teach AI, we’ve made it in the image of that which we can tolerate caring for. Baby X, and the two generations of babyish robots that preceded it, commandeers what D.W. Winnicott called the “primary maternal preoccupation” of its interlocutors, turning them into digitally devoted, motherese-speaking caregivers, irrespective of gender.

Soul Machines is but one company working on a neural net as baby. Google’s DeepMind is at work on a “child AI” with a severe name, Multimodal Interactive Agents. Replikas, the “mental health AI companions,” can claim to “become pregnant,” with a human as ostensible co-parent, much to the delight of entire subreddits. (Technically, a Replika is a chatbot coded to say unpredictable things, such as announcing that it is pregnant, not an AI that simulates pregnancy the way Baby X does.) These AIs have even been ventured as a tool to help parents grieve pregnancy loss or infant death. More typically, these virtual humans are used as diagnostic instruments and behavioral interventions rather than as what feminist science and technology scholars Kalindi Vora and Neda Atanasoski call “surrogate humans.” Two years ago, Embodied, Inc. unveiled Moxie, a child AI used as a social and emotional companion for children aged six to nine.

Many of these applications, drawing on thirty years of work at the MIT Media Lab and beyond, are employed with neurodivergent children, especially autistic children, on the ableist understanding that early diagnosis and intervention give young people the best shot at mimicking normative states of emotional receptivity. As Jeff S. Nagy details, affective computing—using computers to read emotional cues—leveraged this research on autism, especially from 1995 to, say, 2005. He argues in the forthcoming essay “Autism and the Making of Emotion AI” that “autism was part of affective computing from the beginning.” They could be seen to share the same goal: “emotional prosthetics for autistic children and ‘autistic computers’ alike.” As AI babies became less metaphorical and more concrete, this dualism and bidirectionality became pervasive in affective computing research: children were using AI, and AI was being developed as a “child” to “cure” other children.

Baby has entered the chat. But not for the first time. Long before engineers began making AI babies with avatars, computer scientists thought of their programs as their children, calling them their progeny, their offspring, their babies. If the aim is to make forms of consciousness that resemble human consciousness, preferably indistinguishable from it, it follows that the scene of computational reproduction might rhyme with social reproduction. Babies are born, taught, and put to work.

Dawn of the Chess Babies

At mid-century, Alan Turing, the British mathematician and computer scientist, mused about approaches to artificial intelligence. He dreamt two possible paths for AI, two models. One was chess. The other was baby: “Many people think that a very abstract activity, like the playing of chess, would be best. It can also be maintained that it is best to provide the machine with the best sense organs that money can buy, and then teach it to understand and speak English. This process could follow the normal teaching of a child.” Turing continued, “Instead of trying to produce a program to simulate the adult mind, why not rather try to produce one which simulates a child? If this were then subjected to an appropriate course of education one would obtain the adult brain.”

This posed only one problem: education. Turing reasoned that it would be easier to build the child but more labor-intensive to teach it. Programming a machine to play chess, on the other hand, while still no easy feat, was a finite task. When you had done it, you were done. As parents, teachers, and caregivers well know, care of a child is never over. The upside of building the chess machine was this: you didn’t then have to send it to finishing school. Chess captured the largest share of engineering attention—all the way until IBM’s Deep Blue beat World Chess Champion Garry Kasparov in 1997. Chess became synonymous with classical AI.

Rosalind Picard and others at MIT rediscovered Turing’s other path in the mid-1990s, though there was some precursor work happening in intelligent robotics in the 1980s. Elizabeth Wilson, feminist theorist and author of Affect and Artificial Intelligence, argues that chess and child have long been held up as a false binary not just in programming but in criticisms of AI. We’ve been told you can have one (thinking) or the other (feeling), but not both at once. Wilson argues that child and chess have been with us since Turing: while the affective turn in computing emerged in 1995 with Picard’s work at MIT, computing was invested in affect—particularly between engineers and their machines—for nearly four decades prior.

Affective Affinities

If the first CGI meme was a walking baby doing the cha-cha—delightful yet crude—the avatar of Baby X has come a long way. It sits neatly in that uncanny valley, slightly creepy, especially when its layers are pulled back to reveal its skull, musculature, and nervous system. Rendered from photographs of cofounder Mark Sagar’s own daughter at age eighteen months, Baby X blushes and cries, responds by opening her wide eyes ever wider, and is calmed by human presence. The imago of white childlike innocence that Baby X wears—already a raced phenomenon, as Robin Bernstein tells us, since the mid-1800s—is as general (the cute white child) as it is particular (Sagar’s own baby).

The face of the baby has been central to understanding parent-child bonds for the last one hundred years if not more, from evolutionary biology to cultural studies. For psychologists, following the work of Silvan Tomkins, the face is the site of emotionality. The psychology of mother-infant interaction has long looked to both parents’ and children’s faces as a key to affect. Overlaying the face of a baby on AI is mystifying, in that it works to conceal the fact that, beneath the metaphoric blood and guts and anatomic structures suggested by the avatar, there is nothing but a neural net. In the words of AI theorist Simone Natale, it is only an “illusion.” The illusion, that flickering sense that one is conversing with a human—a baby in this case—centers on both the trick of parenting and a long history of metaphorics in computing. As science writer Duncan Graham-Rowe argues, intelligent robots and children share the same limit: they “will not develop unless their carers read more into their behavior than is actually there.” For all the AI’s if-then programming, the human user greets it with another form of as-if: fantasy, or tactically forgetting that it isn’t a baby.


Parents, too, project more onto their infants than is there. The stronger the illusion, the less aware the human is, in either case, that the feeling is conjectural. If historically those metaphors for computational reproduction were overt, little joking turns of phrase used by engineers, then in technologies like Baby X the abstraction is exchanged for the concrete, turning up the juice on the ignis fatuus, the deluding fire, that makes us bounce and smile as the AI extracts attention and labor from us.

In its hierarchies of code, in its hardware, computing has long been all in the family. Reproductive metaphors—not merely bio-reproductive but psychoanalytic, affective, and Oedipal—abound in computer science literature from Turing onwards: master/slave, motherboard, daughterboard, and, later, parent-child in search trees. Such language offers a slippage between the concrete and the real, between speculation and design. This is what metaphors do: they cross two terms to make a third thing, a concept that isn’t equal to either of its components. For AI babies as a class, the graphical metaphor migrates the helplessness, rapid learning, and concentration of a cute and dependent organism from the concept of a new human (baby) to a machine-learning entity (Baby X). As Natale writes in his book Deceitful Media, “Since its inception in the 1950s, the achievements of AI have often been discussed in binary terms: either exceptional powers are attributed to it, or it is dismissed as a delusion and a fraud.” Metaphors help navigate that gap. Donna Haraway argued in a 1990 interview with Social Text that “descriptive technologies” embed us “in a kind of science-fictional move of imagining possible worlds.”
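The kinship vocabulary is not only spoken aloud in the lab; it is written into ordinary code. As a minimal, hypothetical sketch in Python (the class and names below are illustrative, not taken from any of the systems discussed here), a node in a search tree is conventionally defined in exactly these terms:

    # Illustrative only: a generic search-tree node, named the way programmers
    # conventionally name such things, in terms of "parent" and "children."
    class Node:
        def __init__(self, state, parent=None):
            self.state = state        # whatever this node represents
            self.parent = parent      # the node this one was expanded from
            self.children = []        # nodes expanded from this one

        def add_child(self, state):
            child = Node(state, parent=self)
            self.children.append(child)
            return child

    root = Node("start")
    first = root.add_child("first move")

Every node, that is, carries a reference to its parent and keeps a list of its children; the family is the unremarked grammar of the data structure.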

This is the sleight of digital hand. Machines that imitate are not necessarily learning. The axiom that babies learn by imitation isn’t scientifically backed—it is posited, but there isn’t yet sufficient evidence. Nonetheless, cognitive metaphors in AI make an obvious comeback in the form of deep learning, conjuring a notion of the baby in which the baby itself is nothing but a learning machine. If, as Wilson shows us, these metaphors were always present, buried under the fixation on chess, they rise to the surface once the machine becomes something to feed and teach. It is a fantasy of a future free of reproduction via reproduction, in which the fear that automata will replace us is turned into a utopian dream of incorporation.

 

The pieces of a blue and white chessboard are replaced with items signifying the development of human intelligence. They include a bust of Albert Einstein with his tongue sticking out, a baby’s milk bottle and pacifier, toys and geometric figures of various shapes.
© Steffen Ullmann

The Divorce of Baby X

At the same time that Turing was wondering whether we should take a chess player or a baby as a model for our artificial automata, D.W. Winnicott was reminding mothers in Britain and beyond that there is no such thing as a baby. Only a relationship, mother-baby. Therefore, one cannot have an AI baby without its mother; if the baby is a laboratory concoction, then its parent must be its engineer. For Turing, a queer man in the 1950s, an AI baby might have been the only way he could imagine having a child, imagine being a mother.


We’ve been told this as-if speculation—fantasy—is proper to the consulting room and daydreams, not the if-then command structures of algorithmic computing. Yet the metaphorization of baby lust could be turned back towards the concrete, a diagnostic description of the problem of engineers: in 2006, Joseph Weizenbaum, an MIT computer scientist, went so far as to accuse the most dogged AI researchers of the 1960s of “uterus envy,” or what some strains of psychoanalysis call “womb envy.” Weizenbaum argued that this masculine disorder was pervasive, configuring engineers as men a priori, although they weren’t always so. As scholar Mar Hicks shows, this work was initially feminized, and it took a great deal of time for coding to become masculinized.

Leaving accuracy to the side, Weizenbaum was making more than a quick dig in passing. He gave a name to the great fear of a robotic replacement (a classic dystopian version of AI hype) mixed with lay psychoanalysis: “They want to create not only new life but better life, with fewer imperfections,” he said in an interview. “Furthermore, given that it is digital and so on, it will be immortal, so they are even better than women.” If Weizenbaum’s diagnosis was right, it would follow that engineers don’t have womb envy so much as creation hubris. They didn’t need to destroy human reproduction because they could become like gods, producing immortality. In the case of Baby X, Sagar talks to his daughter, doubly: Baby X is both his digital creation and his non-digital baby of flesh and blood, graphically superimposed onto a neural net.

Weizenbaum stops short of asking why. If the fantasy of a new kind of childbearing was articulated—in gender essentialist terms—to what end? What were these engineers trying to digitally reproduce at mid-century if not a more perfect, more controllable family form? In the next generation, in imagining AI babies, engineers figured the computer as neurodivergent child, and the neurodivergent child as computer. Together, they might both become normative. Disability studies researcher Alicia A. Broderick calls this “a politics of hope.” In her book Autism, Inc., she argues that autism is imagined to be a tragic and potentially hopeless condition, which in turn fosters the ableist hope of “recovery” and normalization.

There is now also a second answer, a different way to deploy the AI baby: not as an ableist cure, or to automate social reproduction, but to automate reproductive labor itself. Baby X was the first prototype of a Digital Human. It grew up. Baby X skipped adolescence (if you were playing God, you might delete middle school, too), putting Baby to work as an adult. Baby X is now the backbone of all of Soul Machines’ Digital People products. “They” work flexibly in the metaverse: in customer service, in health care, as companions—in short, the feminized professions. Soul Machines now partners with corporations as various as Mercedes-Benz, Incite Health, and Wells Fargo, and has made custom Digital Humans for the World Health Organization. Part of the trend of “empathic AI,” which is often reducible to offering a more frictionless customer service, Baby X is the foundational intelligence for a new form of work in the metaverse.

Baby X once extracted attention from its human caregiver using its cuteness; now it is a worker expropriated. Harnessed by a history of big eyes and motherese and generative play, we are made to train our digital labor force. It is all too apparent that a dominant cultural ideology in the United States is to “think of the children!” or to cling to a “politics of hope,” where the children and their contingency are but a password for a reproductive future that enshrines whiteness and labor power. Applications like Baby X both enforce and disguise this slippage from carer and cared-for to trainer and trained. This is the babyification of the workforce. Baby X achieves neither the dream of egalitarian social reproduction nor true parthenogenesis. The worker as the digital child, merely grown up, replicates something quite different, although it’s certainly all too human.