Are any of you fellow Americans aware that a favored few of us could have flown to Venus a mere forty-five years ago? Even in the early 1960s, when we were still trying to keep people in orbit longer than the Soviet Union, our leaders were bruiting big ideas for the hardware we’d be sending to the moon before the decade was “out,” to use President John Kennedy’s awkward but oddly appropriate adverb in his May, 1961, speech challenging his fellow Americans to leap headlong into the space race.
One such plan, conceived a few years after JFK imposed his deadline, involved a “fly-by” past Venus and back using the same three-person Apollo command module that, docked to a lunar lander, sped out toward the moon. To complete the Venus run, the module would be propelled by the same leviathan of a booster rocket, the Saturn V, that was getting American astronauts closer and closer to the moon’s surface.
But for the longer trip to Venus, the command vehicle would instead dock with a modified upper stage (the S-IVB, as the techs then called it) that would serve as a combination living quarters and laboratory. There was even a date set for the launch: October 31, 1973. This was the spot on the calendar that planners believed would provide the best—i.e., closest—position for an encounter with Planet Number Two. By NASA’s reckoning, the actual fly-by, assuming all went well, would have taken place March 3, 1974. By then the command module would have cast aside the S-IVB, putting the spent stage into full-time duty collecting data about conditions on the sun. The NASA brass figured that the module would finally splash down in the Pacific by December 1, 1974.
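Taking the planners’ dates above at face value, a quick sketch in Python checks the mission-leg durations they imply (the dates are from the text; the variable names are mine):

```python
from datetime import date

# Mission milestones as stated in the planning timeline above
launch = date(1973, 10, 31)     # Halloween launch
flyby = date(1974, 3, 3)        # closest approach to Venus
splashdown = date(1974, 12, 1)  # Pacific recovery

outbound = (flyby - launch).days       # Earth-to-Venus leg
homebound = (splashdown - flyby).days  # Venus-to-Earth leg
total = (splashdown - launch).days     # whole round trip

print(outbound, homebound, total)  # prints: 123 273 396
```

In other words, a roughly four-month sprint out and a nine-month coast home: about thirteen months in deep space, far longer than any Apollo lunar mission.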
I could go on to enumerate some of the scientific objectives of this manned mission (as such things were commonly referred to back then) to another world. But I can sense what some of you are already thinking: why would we or anybody else do this, then, now, or ever? Hadn’t satellites already transmitted enough data from Venus to confirm that it wasn’t exactly a promising place for humans to visit, much less colonize? In 1962, Mariner II, that doughty little robot, had grimly dispelled any remaining reveries futurists might have been nursing about a habitable, beatific “sister planet” (populated, as some folklore would have it, solely by women). Mariner II reported surface temperatures anywhere between 300 and 800 degrees Fahrenheit, and delivered estimates of Venusian atmospheric pressure running, oh, at least twenty times that of Earth.
So if there was no life-as-we-knew-it, no resources to gather, and no plausible military or strategic advantage in flying humans to and from Venus, why bother? Such questions eventually pitch and bend their way toward the customary arguments that Luddites and other space skeptics have always (soundly) lofted against space travel, for example: why are we still insisting on pushing our DNA into places that never asked for it? What’s the point of sending anything we make, grow, or raise into space? After all, what measurable gains had we reaped by virtue of landing American astronauts on the moon ahead of the Soviet competition—apart, that is, from the chastened twenty-first-century spectacle of a host of conspiracy theorists, including a healthy contingent of celebrities and sports stars, fervently arguing that the “moon thing” was a hoax perpetrated on a TV sound stage?
Fair enough; on one level, the Luddites will always have a point, maybe even an advantage, in these arguments because they deal most often in what’s directly in front of us, what is plausible, and what is (or isn’t) necessary as opposed to the purely speculative. The “moon conspiracy” theories are all speculation, which means their proponents are powered by superstition. This, too, I understand without accepting, which is about as composed as I can get about them.
Besides, I too have a superstition; one I’ve been carrying with me for most of my life. Being a child of both the first space age and the civil rights movement that ran concurrently between the late 1950s and the early 1970s, I took it for granted that the same midcentury energies compelling white men to reach for the stars were also forcing white people (some of them, anyway) to reconsider their presumptions of race.
I usually cite some random (if not arbitrary) convergences as evidence: in 1947 America integrated baseball and broke the sound barrier within a span of six months. Ten years later, the Soviets launched Sputnik—barely ten days after President Eisenhower sent federal troops into Little Rock, Arkansas, to protect black students integrating Central High School. In the same 1961 spring when Yuri Gagarin and Alan Shepard became the first humans to fly into space, Freedom Riders carried out their comparably hazardous mission to desegregate interstate bus travel.
Yes, I know that whatever was left of the civil rights movement was in near tatters by the time Apollo 8 made its epochal December, 1968, flight to the moon. But I think I can still win my argument by invoking those “Earthrise” photos taken in lunar orbit. I’m not the only one who believes that breathtaking view helped galvanize an ecology movement that would, within two years, establish what we still acknowledge as “Earth Day.” I did not then and do not now believe it entirely coincidental that the impulse to fly higher and faster than earthlings ever had is bonded with the will to alter how we perceive ourselves and our possibilities for change. So today when I consider the potential benefits of a hypothetical tour of Venus taking place about half-past Watergate, I am not thinking primarily of what there was to find out there, but of what we could have discovered about ourselves.
Indeed, pause for a moment and think of Watergate in relief of all this: assuming everything else was historically in place back then, the Halloween 1973 launch to Venus would have happened eleven days after the “Saturday Night Massacre” when all hell broke loose at the Justice Department and people were starting to think seriously about impeaching then-president Richard Nixon. Yes, we could have thought to ourselves, our chief executive is a mendacious, spiteful and, quite likely, inept con. (Those were the days!) On the other hand . . . look how far and deep we’re going! Look at the chances we’re taking! All this petty-minded, mean-spirited chicanery in Washington looks even smaller from twenty-four million miles away! We’re small, too. But we’re reaching for something beyond our constricted worldview. We’re going for it!
Going for what? a Luddite detractor might well have asked, and I admit that, even in the speculative ambit of this counterfactual thought experiment, I probably couldn’t have summoned forth a satisfactory answer. But Apollo 11 astronaut Michael Collins, who radioed to the crew of the aforementioned Apollo 8 flight the auspicious command, “You are Go for TLI (Trans-Lunar Injection)”—permitting human beings in that moment to break totally free of Earth for the first time—pretty much speaks for me now when people ask, What was going to the moon all about?
As Collins concisely summed up the Apollo project to an interviewer in the 1994 PBS documentary Mission to the Moon, “It was about leaving.”
Leaving. So much for all the Cold War rhetoric about shoring up our national prestige, the sacred imperative of vanquishing communist Russia, and the rather rushed, ex post facto bid to rationalize the economic side of the Apollo missions by appealing to trickle-down technological innovations. It was time to go, and so we did. Period. It may not be enough of an answer for those who insist we should stop spending so much money on space and use it to solve our problems on Earth. Except that our government hasn’t spent all that much of its treasure on space lately, leaving most of the bill to galactically wealthy buccaneers like Elon Musk and Jeff Bezos. In the meantime, how are we doing on that solving-our-problems-on-Earth thing? Anybody?
Whatever this superstition toward progress is, I must have it pretty bad. One all too plain symptom is that I can’t stop staring at and replaying on YouTube a digitally animated simulation of this hypothetical Venus mission from launch to splashdown. Venus-Apollo 1, or whatever it would have been called, haunts me in a peculiarly twentieth-century sort of way. It arouses my nostalgia for a past that never existed; one that might have broadened our perspective on where we are in the cosmos—together with a long-overdue reappraisal of who we are, and could be, on the pavement. Back in 1970, Norman Mailer, toward the end of his book about Apollo 11, Of a Fire on the Moon, arrived at a rationale for space travel eerily close to where I’ve landed now. (He refers to himself by his astrological sign):
Yes, [Aquarius] had come to believe by the end of this long summer that probably we had to explore into outer space, for technology had penetrated the modern mind to such a depth that voyages in space might have become the last way to discover the metaphysical pits of that world of technique which choked the pores of modern consciousness—yes, we might have to go out into space until the mystery of new discovery would force us to regard the world once again as poets, behold it as savages who knew that if the universe was a lock, its key was metaphor rather than measure.
This sidelong convergence of the metaphysical and cosmic within the hectic modern mind also largely sums up how I’ve come to feel about 2001: A Space Odyssey, which is now marking the fiftieth anniversary of its release in the eventful year of the American Lord 1968. Admittedly, it’s taken me nearly all of that half-century to get to the point of grokking (as we sci-fi fans would put it back then) Stanley Kubrick’s singular space opera—longer, perhaps, than it’s taken NASA probes to reach Pluto.
Be that as it may, I am now all in on Kubrick’s majestic enigma of a science fiction epic in ways I wasn’t before—which, given my besotted-ness with all things space in my teen years, now astounds my late-middle-aged self. Yet 2001’s vision of cosmic transcendence, as conceived by director Kubrick and his screenplay collaborator Arthur C. Clarke, seemed on my first viewing to be as slick and glossy as a Life magazine spread of photos taken from real-life space missions, except with cavemen and a big black slab that made noises like a cathedral chorus in anguish. In short, a gaudy, massively diverting show of light and shadow, but somewhat less than the sum of its many whirring, spinning, and floating parts. Such, at any rate, were the initial judgments of fifteen-year-old future movie reviewer Gene Seymour, who was not yet conversant with the recreational drugs widely believed in those days to, as the euphemism goes, enhance one’s appreciation of the movie.
At least I was nicer about it than many of the best film critics of the day. John Simon labeled it a “shaggy God story” while Pauline Kael and Renata Adler, in one of the few cultural debates finding them in total agreement, rhetorically scuttled Kubrick’s grand design. “Somewhere between hypnotic and immensely boring,” Adler wrote. “Trash masquerading as art,” Kael declared, adding, “The ponderous blurry appeal of the picture may be that it takes its stoned audience out of this world to a consoling vision of a graceful world of space, controlled by superior godlike minds.” (In retrospect, it’s perversely impressive to see how Kael manages to correctly nail down the movie’s essence even as she administers her snarky headlock.) Her arch-enemy Andrew Sarris was crueler still, dismissing 2001 as “a thoroughly uninteresting failure.”
Most of these astringent critical reactions are included in Space Odyssey: Stanley Kubrick, Arthur C. Clarke, and the Making of a Masterpiece (Simon & Schuster), Michael Benson’s captivating, thorough, and perceptive “making-of” history published in sync with the film’s fiftieth anniversary. Benson also pointedly recounts the hostility the picture aroused among studio executives, celebrities, and others when they first saw what had resulted from four years of painstaking, expensive work. Even Clarke was rendered baffled and disconsolate after a preview screening and received a letter of sympathy from Casablanca co-screenwriter Howard Koch lamenting that “[s]o much money and brains [were] expended on such a cold, unhuman [sic], bloodless work.” Clarke, after another pre-release screening of the film’s first cut in New York that left him in tears (and resulted, according to the book, in a “recorded” 241 walkouts), recalled overhearing an executive from MGM, the film’s distributor, exclaiming, “Well, that’s the end of Stanley Kubrick.”
It wasn’t, of course. Not of Kubrick or of his movie, which was recut (with about nineteen minutes shaved away) and re-released with all its ambiguities, inferences, abstract imagery, and glossy designs intact. Soon the initial wave of bemused antagonism was matched, if not overpowered, by slow-rising ticket sales, based mostly on positive word-of-mouth among under-twenty-five moviegoers for whom, as one Variety writer put it, “visual and aural sensations have replaced words.” Meanwhile, critics such as Newsday’s Joseph Gelmis, two weeks after his initial negative review was published, released what Benson calls a “remarkable mea culpa” that proclaimed 2001 a “masterwork.” Gelmis compared 2001’s prickly initial critical reception to the way literary critics in 1851 found faults in Herman Melville’s Moby-Dick that later generations of readers and commentators would embrace as virtues.
Even Sarris admitted seeing the movie again “under the influence of a smoked substance . . . somewhat stronger and more authentic than oregano” and was thus persuaded that “2001 is indeed a major work by a major artist.” Regardless of whether Sarris might have been kidding—it was the sixties, after all, kids—his apparent change of heart foreshadowed similar revisionist appreciations of Kubrick’s film over the ensuing decades. Now it’s hard to find anybody who doesn’t believe 2001: A Space Odyssey is a landmark movie, a masterwork. In each of the last three Sight and Sound polls of critics and directors, the film has ascended into top-ten positions as one of the most important movies ever made, along with such perennials as Citizen Kane, Vertigo, and 8 1/2.
My own revisionist enthusiasm for 2001 stems mostly from what the movie doesn’t do, as opposed to what it does. In the half-century interval between 1968 and now, dozens of science fiction movies have done almost everything with the possibilities Kubrick’s movie opened for the genre except reach for its towering inscrutability. Back when major Hollywood studios like MGM were in a rut trying to figure out what audiences wanted, foreign and independent films were at or near the center of cinematic discourse. Audiences now expect to sit back and have movies explain everything as the action rumbles along—and even in 1968, there were limits to their forbearance for movies that made them work. But there was at least tolerance—indeed, even cachet—back then for movies that upended expectations or that were otherwise slow to reveal their secrets. What we used to call “world cinema” was as much a part of a romantic, insurgent nineteen-sixties as social ferment—and a race to the moon.
So, in the spirit of world-cinema inquiry, what was that big black slab that made the prehistoric mob go—if you’ll pardon the expression—apeshit in 2001’s “Dawn of Man” opening sequence? And why did it show up eons later “deliberately buried” beneath the lunar surface? What were all those anxious voices singing about? Even the most tolerant and/or drugged-out spectators weren’t able to say upon the first, second, or even third viewing. But if they’d read Clarke’s novelization of 2001 (which wasn’t technically a “novelization” so much as a treatment of the movie written by Clarke as part of the production process before it was turned into a screenplay) they would know that the great black monolith came from an advanced alien species attempting to goad our own toward further advancement. According to Benson’s account, Kubrick and others were working on actually showing what one of those aliens looked like almost to the very conclusion of production. In the end, they didn’t. And the intervening years, with space operas and fantasy horror shows depicting all manner of exotic otherworldly beings, have somehow validated Kubrick’s less-is-more approach, placing the burden of imagination where it belongs: with bewildered but game earthlings. Still, whether we figured out the Monolith’s origins or not, we didn’t learn all that much from it, since it’s now been forty-six years (and counting) since any country has sent human beings beyond Earth’s orbit. The urge for going has long gone—even if the Elon Musks and Jeff Bezoses insist it’s coming back any time now.
In the meantime, the movie’s other memorable, less abstract object, also known as HAL 9000, haunts today’s mind-set for what it augurs about our co-dependent relationships with our computers. The idea of a supercomputer with malevolent designs upon our human souls was something even the most incredulous reviewers of 2001 could imagine; indeed, many of them insisted that the whole sequence aboard the Jupiter-bound Discovery was the only part of the movie they enjoyed, providing the dark comedy they were expecting and maybe hoping for from the director of Dr. Strangelove. Artificial intelligence doesn’t yet hold the dominion over our human lives that HAL—poor, conflicted, overburdened, algorithm-ailing HAL—had over his doomed crewmates. But social media and its discontents are so far instilling enough uneasiness over our presumed autonomy to make one think that, in at least one respect, Kubrick and Clarke all but nailed down one dismal prophecy for the next century.
Still, I’d have rather flown to Jupiter by now. That’s just me. And I’m perfectly willing to accept the possibility that a spacefaring world would be no better than the one we live in now; that it may have retained most of the polarities between countries and tribes that in different ways threaten peace, justice, and democracy. But even the most dystopian science fiction retains a subversive belief in the future’s ability to make us different from whatever we are now. That may not be logical. Faith rarely is. And in both the things it explains and the things it leaves open for speculation, 2001: A Space Odyssey remains after a half-century the most persuasive and compelling expression of that faith; not despite its ambiguities, but because of them. The open spaces it leaves to imagination may not satisfy those who prefer whatever passes for the Well-Built Movie. Some works of art, even grand desperate voyages like those chronicled in Moby-Dick, require abandon and acceptance of even their more outrageous inventions. Not all blanks can be filled immediately.
But what would happen if we just said, “Screw it,” to any notion of moving beyond our narrow planes of existence? This, too, is a “what if” query that science fiction engages, and among the better, if somewhat drastic, depictions of such prospects is “Myths of the Near Future,” the title story of a 1982 collection by the British fantasist J.G. Ballard (1930-2009), whose acerbic, post-apocalyptic visions of alternate worlds and surreal societies made him a near-polar opposite of the more utopian and optimistic Clarke. “Myths” is set in a dense, swampy, near-abandoned Florida coastline region where the gantries and launch pads of Cape Canaveral (referred to by its temporary name of Cape Kennedy) have stood rusted and inactive for decades. Those now inhabiting the region wander aimlessly as if in a dream state. Some fly and crash airplanes. Others stay locked in motel rooms or sit outside staring at dry swimming pools clotted with weeds and debris. They suffer from what Ballard refers to as “space sickness,” blending an immobilizing nostalgia for the space age with an impulse toward long-term hibernation.
“Blame [for space-sickness] seemed to lie with the depletion of the ozone layer that had continued apace in the 1980s and 1990s,” Ballard writes. It gets worse:
But always there was the exaggerated response to sunlight, the erratic migraines and smarting corneas that hinted at the nervous origins of the malaise. There was the taste for wayward and compulsive hobbies, like the marking of obsessional words in a novel, the construction of pointless arithmetical puzzles on a pocket calculator, the collecting of fragments of TV programmes on a video recorder, and the hours spent playing back particular facial grimaces or shots of staircases.
All those water-based systems available through your buddy the computer—clouds, streams, etc.—can now give you all that and more. Science fiction does make some strangely accurate calls, but I suppose I’ll settle for the versions of the past I can get on YouTube—especially the ones that take us to Venus.