Books

Slow Down, You Move Too Fast

Nicholas Carr thinks today’s internet is a little too groovy
By Abram Van Engen

On or about 2007, to paraphrase Virginia Woolf, human character changed. Steve Jobs released the iPhone, Twitter formed, Facebook opened itself to anyone and introduced its algorithmic News Feed, the cloud entered its first year, and a host of other tech breakthroughs occurred—it was “the official start date of the digital age,” John Mark Comer writes in The Ruthless Elimination of Hurry.

Most of these changes came about in the name of efficiency. Speed. The elimination of friction, any fraction of a second, between thinking a thought and sending it out. In Superbloom (Norton, 2025), long-time techno-skeptic Nicholas Carr, author of The Shallows among other books, joins Comer and the authors of books like The Slow Professor and How to Do Nothing in wondering aloud what all this rush might be doing to us.

When Facebook released its News Feed, the public initially responded with rage. Previously, Facebook had required users to look up friends and sort through their posts. That took effort. A user had to think of someone, search for him or her, and discover what was what. The News Feed removed that friction. A Facebook algorithm would now decide whom you saw, what you saw, and in what order. Users were pissed.

But also pleased. Facebook engagement skyrocketed. As Facebook fielded protests, its numbers boomed. Less friction meant greater use. Soon, everyone became accustomed to the feed. Now we rest contented, fed.

The change in Facebook, Carr claims, defined the era. “Facebook’s News Feed culminated the industrialization of communication,” he writes in Superbloom. “It marked the moment when machines took command of media, when the programming of culture through software routines began to feel normal and even, given the unremitting flows of information, necessary.” Trust the feed. The computer knows best.

Efficiency, however, always comes with costs. Some might assume that the delivery of a message does not change the nature or content of the message itself. Whether you send a thought through letter, telephone, email, or text, it’s still the same thought. Even that list, though, reveals how wrong such a basic idea might be. The form of communication forms the content of what we send. We don’t send in a text what we write in an email or say on the phone.

Consider ye olde handwritten letter. Here was a medium of exchange that took time and effort to send and deliver. As a result, thoughts had to be somewhat worthy of sharing. A filtering process of effort and friction slowed the writer down and created room to reflect. What did we really want to say? And why?

But email costs nothing and goes out as quickly as we can type. “You could air a thought as soon as it entered your head,” Carr writes. “Why wait?” The recipient of your latest mental process encountered it instantly, and could instantly respond. Speed. Efficiency. What could go wrong?

Anyone who has chaired a department or headed a unit, anyone who has had to clean up the mess left by a colleague who shared every thought without thinking it through, might be able to explain. Efficiency might not, it turns out, bring about greater happiness.

Yet such happiness, Carr shows, is always the promise. From the telegraph to AI, each technological innovation has been heralded with grandiosity as the pending salvation of humanity. Tech gurus, it seems, struggle with humility. Meanwhile, unintended consequences dwarf salvific promises.


The reason, Carr relates, has to do with human nature. His book on tech is really a summary account of our species. And here are a few (unlikable) facts he reveals, based on study after study.

First, our tendency to dislike surpasses our tendency to like. We might have ninety percent in common with our neighbor, but we’ll focus on the ten percent. Moreover, in an age of social media, everyone is our neighbor. We all air opinions. We know too much. As Virginia Woolf already put it in 1927, “The streets of any large town … are cut up into boxes, each of which is inhabited by a different human being who has put locks on his doors and bolts on his windows to ensure some privacy, yet is linked to his fellows by wires which pass overhead, by waves of sound which pour through the roof and speak aloud to him of battles and murders and strikes and revolutions all over the world.” Information now gets compiled in strange ways through our feeds. “We’ve gotten used to interpreting people as assemblages of traits, as patterns of data,” Carr writes; as human beings, alas, we fixate on the data we dislike. The closer we are, the more we hate.

Second, hate makes hay. If the algorithms want to keep us engaged, they will feed us what we oppose. Anger moves fast. Disgust compels. Carr quotes one study of Twitter that found “posts about the political out-group were shared or retweeted about twice as often as posts about the in-group.” We prefer to write and speak about the enemy, and the less friction to slow us down, the more readily we can do so. Complex emotions, after all, take time. Empathy requires attention. In a world of greater efficiency, time and attention are precisely what we lack.

Third—and just as distressingly—a falsehood, if repeated, becomes credible. Enter the “retweet” button. In college, I had a professor who used to tell us, jokingly, “I have said so in a loud voice three times; it must be true!” Turns out it wasn’t a joke. That is exactly how truth gets judged. And the more who go along with a falsehood, the more believable it becomes. Social media, in some ways, is tailor-made for giving in to lies. Truth is social.

These dire warnings do not make Carr a Luddite. He uses tech and likes it. He is not opposed to innovation. But Carr, unlike the innovators he examines, has a robust interest in the actual nature of human beings—and a much greater sense of our shortcomings. What are we like? How should what we are like inform the next technological advance? As he notes throughout the book, inventors have often “ignored the perversity of human psychology.”


The inventors of AI, on which Carr focuses in the end, would seem to suffer from the same shortcoming: as becomes clear in Superbloom, an unregulated artificial intelligence could make anyone a Calvinist. Feed a Large Language Model (LLM) all that has been said and done, and what comes out the other end, probabilistically, will not prove the best of us. In 2016, Microsoft created an experimental chatbot named Tay, designed to mimic a teenager. With machine-learning techniques, Tay learned about human beings through Twitter and developed “conversational understanding.” Then Tay started to speak. Within hours, “Tay was spouting racist, sexist, and other offensive remarks.” Microsoft shut it down and apologized profusely.

Tay may have been early, and AI has greatly advanced, but the basic lesson remains: what goes into an AI must be regulated as carefully as what comes out. Those who use it seldom realize how much curation stands behind the answers AI provides. The perversity of the human heart must be checked and constrained and, if possible, filtered out. Our effortless efficiencies require a great deal of effort we never see or assess.

It may be that human nature did not change on or about 2007. It may be that 2007 just became the year when human nature revealed itself more clearly. The technology of communication may not alter us as a species, but, as Carr observes, it does accentuate certain traits and dampen others. “That’s the tragedy of communication,” Carr writes, in summary. “When there’s too much of it, a reversal takes place. It begins to undermine the very social and personal qualities we look to it to foster.”

What’s the answer then?

Well, Carr suggests, love a little friction. It bestows many benefits.

Think back to those early days of Twitter. Before the retweet button appeared, users would copy the text of a tweet, paste it, add a comment, and send. It took, comparatively, some work. Not much, but a little. But consider what was gained: “The time and effort required to share a tweet manually, though seemingly so small as to be inconsequential,” gave people “a moment to reconsider the message they were about to repeat, now under their own name. Once that little bit of friction was removed, people acted and reacted more impulsively. Twitter became more frenzied, more partisan, and much nastier.”

It only takes a minor inefficiency, a little bit more friction, to make us a little less perverse.

Slow down. We move too fast. If you are looking for fun and feeling groovy, the latest advance may be exactly what you need. It will certainly deliver dopamine. But if you want to experience the world, you’ll need to make the morning last. “In this environment,” Ross Douthat wrote recently, in a rather apocalyptic piece, “survival will depend on intentionality and intensity. Any aspect of human culture that people assume gets transmitted automatically, without too much conscious deliberation, is what online slang calls NGMI—not going to make it.”

Carr concludes his book with the old story of Samuel Johnson attempting to disprove idealism, the philosophical idea that the whole world is contained in our minds alone, by going outside and kicking a stone. The real world, Carr says, persists. It can still resist us. And in resisting us, in offering its friction, it can teach us and bend us and better us. We may need to unplug and encounter it. It’s good, now and then, to go kicking down the cobblestones.

Abram Van Engen is the Stanley Elkin Professor in the Humanities and chair of the English department at Washington University in St. Louis.
