Q&A: Nicholas Carr on Computers, Toilet Technology, the Hollowing-Out of the Middle Class, and More

Nicholas Carr’s previous book The Shallows issued a warning about what the Internet is doing to our brains. His new one, The Glass Cage: Automation and Us (Norton, 288 pages, $26.95), warns us about what computer technology is doing to our lives—our jobs, our economy, and our sense of fulfillment as human beings within the natural world.

Are Google and GPS making us lazy and dumb? Will our self-driving cars save our lives while coldly running over our neighbor’s dog? Maybe! But I decided to skip those questions and talk to him instead about money, plumbing, car repair, and robots.

*

Your book gives an overview of the history of automation technology, from Adam Smith up to today—could you talk a little bit about the changing attitudes of the labor movement toward machinery?

There’s always been this same tension between fears about machines taking over jobs and optimism about machines expanding the set of jobs available. Even in the work of Adam Smith, we see both attitudes—that a lot of jobs will be rendered obsolete by factory machinery and workers’ skills will be lost, but on the other hand that the increased productivity will ultimately expand job opportunities and raise standards of living throughout the economy.

That was true up until recently, when history started to diverge a bit. That positive economic dynamic isn’t in evidence with computers as labor-saving technology. Computers are able to take over more and different tasks than factory machines were, but we’re not seeing the benefits of those productivity gains being spread throughout the economy. Instead, we’re seeing them concentrated, through increased profits, in the hands of a relatively small number of people. We’ve seen a kind of hollowing-out of middle-class work, and more and more polarization in income and in wealth.

Do you have an opinion about how automation technology will contribute to that disconnect in the future, between productivity and wealth on the one hand, and employment and wages on the other?

Labor markets are extremely hard to predict. Computers and technology are one determinant of the shape of labor markets, but they’re not the only determinant. And as we’ve seen very recently, in the last year or two, the broad economic cycle still plays a very large role in the labor market. Having said that, I think we’re still not seeing the emergence of broad new categories of good jobs. I can’t rule out that happening, but as computers become more capable of replicating a wider range of human skills—not just manual skills, but analytical skills, judgment-making and decision-making skills—I think we can expect more erosion in the number and quality of jobs, and so I think we’re right to be concerned over the long run about a continued erosion of the middle class.

Your book is really less about the labor market overall than it is about the personal experience of work. That section where you do a close reading of the Robert Frost poem “Mowing” to talk about how the simple pleasures of physical labor can bring contentment and an awareness of oneself in the world, that part is beautiful.

But to push back on that a little bit, I couldn’t help thinking about how so much physical work is really terrible, and painful, and, historically, unsafe. Now we’re at a time in American history where no one has to pick cotton by hand, because we have machines that do that now, or empty chamber pots, because we have plumbing, and in the future, even the physically demanding job of being a runner in an Amazon warehouse will probably be replaced by robots, so….

Well, I’m certainly not in favor of inhumane working conditions. That part is less about technology’s influence on the number and mix of jobs (even though that’s an extremely important subject) than it is about the quality of jobs and the quality of life for people who become dependent on computers. Which is a different subject from, but somewhat related to, workers’ rights and so forth.

I think throughout the book I try to make the argument that tools we use can be designed and used in two very different ways. In one way, they increase our engagement with the world, and encourage us to develop and use new talents, and push us into a greater entanglement with life—which leads to more satisfaction and more fulfillment. And that’s not just true of manual jobs; that’s true of any kind of activity—intellectual activity, physical activity. Then there’s another approach in which we simply become dependent on the technology, and we begin to lose our own autonomy. We become observers who watch monitors, rather than talented people who engage in complex ways with the world.

So, to me, that’s the philosophical challenge that we face in our personal lives and our work lives: whether these new tools are enriching our lives and enriching our work, or impoverishing our lives and impoverishing our work. So to me, that’s completely different than, you know, shitty jobs and exploiting workers, which I am opposed to.

Sure, I mean, there’s computer software, and then there’s indoor plumbing, and those have different roles in our lives.

And that’s an example of the way technology used to work: you used it, but you kind of understood what was going on. You may not have been a plumber or an engineer, but you kind of knew what the pipes in your house did. As we’ve become dependent on software, something very different is happening: we don’t see or understand how the algorithms work, and that raises the danger of manipulation. You use a software program, and your own intentions are being shaped by the people writing the software, but because it’s all hidden from you, you don’t fully grasp how. So we begin to see an erosion of agency.

This reminds me of a friend who loves to work on old cars, and who complains about how more and more cars now are totally computerized. If something goes wrong with a new car, you often can’t fix it yourself—you basically have to bring it back to the dealership.

Yes, the workings become deliberately hidden from you. Not only do you lose the ability to kind of understand and work on things yourself, but you also become dependent on the technology.

And I guess you’re not only dependent on the technology, but you’re dependent on the people who own it, or who patented it, because it’s hard to even get in there without permission.

Anything else you wanted to talk about?

I think it’s worth emphasizing the way that we’re entering a new era in automation, where now, because we have analytical algorithms that draw on so much data, we’re going to be constantly tempted to shift lots of professional work and intellectual judgment from human beings to computers—whether it’s doctors and nurses, or business managers, or lawyers, or government analysts. Beyond the hard questions that raises about the labor market, and about where good jobs will come from in the future, it also raises some basic philosophical and sociological questions about our urge to turn over complex analyses and judgment-making to computers, simply because they’re fast and efficient.

It’s going to become very easy for us to lose sight of the unique capabilities that we humans have that computers can’t replicate—things like common sense, empathy, ingenuity. And to me, one of the big dangers here is coming to accept the apparent hyper-rationalism of the computer in all aspects of our lives, and denigrating our own skills. I hope we resist that.