When Andrew Sullivan, the big bearded granddad of political blogging, announced his retirement in February, the Internet was quick to perform an autopsy—as usual, slightly before its patient had died. Blogging was finished, everyone agreed, or at least it was looking wan and palsied. But what exactly was it that the pundits had gathered to eulogize? A publishing platform? The art of conversation? The cultivation of a loyal online community? One thing’s clear: whatever it might have been, classic blogging was a lot better than what seems to have killed it.
I have asked many friends to explain to me the charms of social media. I use these services, but I’ve never quite understood their appeal. Is it all idle gossip? A way to keep in touch? A harmless distraction, like BuzzFeed, or Weird Al?
The answer I typically get has nothing at all to do with fun or distraction. You have to use social media, I’m told—to make contacts, to curry influence, and to build awareness of whatever it is you do with your time. You might find Twitter fun or insufferable, they say; either way, it’s an indispensable promotional tool.
This glances back, somewhat wistfully, to the original webernet ideals—entrepreneurship, craft, amateurism, independence. But casting social media as a kind of people’s advertising platform makes for a shaky defense. As promotional strategies go, flinging out endless quantities of poorly edited one-liners is hardly an improvement on the efforts of Madison Avenue. The appeal of the technique is its simplicity: anyone can sign up, link, like, and tweet. But this style of self-advertisement trades sophistication for obsessiveness. Like a dupe at the slots, you keep punching buttons, hoping that persistence will lead to a lucky success.
Still, social media is certainly no worse than word-of-mouth. For most people, that’s probably all it amounts to. Personal or public, private or promotional—to you, to your dad, to your friend in the band, a year of posting hardly counts for more than chump change. The small screen of the smartphone and the stingy format of the tweet conspire to keep users from producing anything of even briefly lasting value. Nor do they encourage careful discrimination between tossed-off remarks and targeted announcements.
This enforced sketchiness is precisely the point. The true purpose of such tools is to crystallize the kind of daily nattering that has always been with us, and to transcribe and record it in a way that can be exploited by the real advertisers—the business titans, the opinion trackers, the mavens of market research.
This dubious bargain—scattershot careerism for the little guy, laser-targeted marketing for the rich—has become so well established that it’s hard to believe that only a few years ago people seriously looked forward to something different. In the mid-2000s, big talkers like Cory Doctorow, Yochai Benkler, Jonathan Zittrain, Clay Shirky, Kevin Kelly, and Lawrence Lessig pushed for a new sharing economy that would celebrate collaboration, volunteerism, and costless transactions.
Crowdsourcing, back then, didn’t mean hitting up your friends and fans for handouts. It meant that people would willingly donate online efforts to something other than shopping, watching, and badinage. Optimists foresaw whole novels being produced this way, as well as games, new industries, content curation, scientific research, and even feature-length films. Savvy remixing promised to reconceive the basic nature of media, raising digital collage into an art form in its own right. The point of this burgeoning creative commons wasn’t to celebrate every pilfered video clip as a triumph of dilettantism. It was to meld unmercenary creativity, free exchange, and the human instinct for cooperation into a new cultural order.
This vaunted market of makers, coders, educators, and collaborators had its counterpart in a blogosphere addicted to debate. The essence of blogging in those days was reengagement. You found an ardent soul who disagreed with you, and you revisited the disagreement again and again. Of course, the arguments often descended into sniping and nitpicking, but the sniping and nitpicking came with accountability.
A blog was more than a stream of lightly edited thumbnail articles. It was, as the name suggests, a log, an archive, a transcript of evolving opinions, where any sloppy assertion or dubious claim could be brought to bear on a fresh disagreement. The blog framed its creator as a one-man or -woman magazine, with editorial philosophy and writerly voice entailed in one idiosyncratic ego. Most open-source projects waned or failed through lack of talent, but writing is in many ways the signal art of democracy—distinct from TV, film, software, and even handicrafts—in that, if you consume it, you can produce it. At that time, blogs looked like a sharing economy that had already arrived, a ferment of free and unconventional ideas.
There are some who assume that social media has improved on this promise, as if bandying zingers with frenemies on Twitter is as empowering—or as demanding—as running one’s own magazine. Even sardonic accounts of social-media silliness still dutifully nod to our old, familiar dream—that the ever-evolving Internet will inevitably usher in a golden age of cultural participation and democratic engagement. Instead, we’ve arrived at a bean-counting system that exaggerates the worst features of a typical mass audience—manufactured mini-scandals, empty trends, and colicky eruptions of disgruntled consumerism.
All of which has its fun side, to be sure. But if this is what blogging died for, I hope we can refrain from dancing on the grave.