Remember the Internet?

2015, as imagined by 1995

Matthew Yglesias is the latest and most intelligent blogger to dredge up a bad ’90s-era prediction about the future of the Internet. Yglesias’s target is a 1995 Newsweek article by Clifford Stoll entitled “Why the Web Won’t Be Nirvana.” He draws attention to Stoll’s opening paragraphs, which dismiss the digital realm’s then much-hyped world-changing potential as “baloney.” Back then, Stoll wrote:

After two decades online, I’m perplexed. It’s not that I haven’t had a gas of a good time on the Internet. I’ve met great people and even caught a hacker or two. But today, I’m uneasy about this most trendy and oversold community. Visionaries see a future of telecommuting workers, interactive libraries and multimedia classrooms. They speak of electronic town meetings and virtual communities. Commerce and business will shift from offices and malls to networks and modems. And the freedom of digital networks will make government more democratic.

Baloney. Do our computer pundits lack all common sense? The truth is no online database will replace your daily newspaper, no CD-ROM can take the place of a competent teacher and no computer network will change the way government works.

Consider today’s online world. The Usenet, a worldwide bulletin board, allows anyone to post messages across the nation. Your word gets out, leapfrogging editors and publishers. Every voice can be heard cheaply and instantly. The result? Every voice is heard. The cacophony more closely resembles citizens band radio, complete with handles, harassment, and anonymous threats. When most everyone shouts, few listen. How about electronic publishing? Try reading a book on disc. At best, it’s an unpleasant chore: the myopic glow of a clunky computer replaces the friendly pages of a book. And you can’t tote that laptop to the beach. Yet Nicholas Negroponte, director of the MIT Media Lab, predicts that we’ll soon buy books and newspapers straight over the Internet. Uh, sure.

Stoll’s exasperation at the notion of “books on disc” and his confidence that “no online database will replace your daily newspaper” are knee-slappers, to be sure. Of the industries that digital technology has felled, print media is the Goliath. The music and publishing industries, like the Philistines of old, are imploding in its wake. And if you scan through the utopian scribblings of early ’90s techies, you will find thousands of articles predicting these events. By dismissing the Internet’s industry-toppling potential, Stoll not only lacked foresight, he lacked good sense. When Nicholas Negroponte (who knows whereof he speaks) offers his thoughts on the direction of media technology, it’s best to rebut with something more substantial than “Uh, sure.” (Of course, then you’d sacrifice the knowing insouciance that makes Newsweek-brow editorials so much fun!)

But Stoll’s article draws attention to a fact that those of us laughing at him in hindsight often overlook. The Internet in 1995 wasn’t the Internet.

The Ford Model T resembles a Tesla Model S more than the World Wide Web Stoll surfed resembles the networked world we inhabit in 2013. Today, the separation between digital life and analogue life has essentially dissolved. Even the word “Internet” sounds a little quaint, though it’s the term that survived several terminology purges (RIP World Wide Web, Information Superhighway, “the ‘Net,” etc.). The Internet is fast becoming for us what Christianity was for medieval Europeans: not a religion or ideology but a totalizing epistemology that is almost impossible to imagine your way out of.

Today I just use my iPhone’s built-in translator.

Now think about 1995: Bill Clinton was still in his first term (his and Al Gore’s emphasis on Internet connectivity in public classrooms inspired ridicule in conservative media). Many Americans had only recently purchased their first computers (if they owned one at all). And computer ownership didn’t guarantee connectivity (my cousin’s computer seemed to exist for the sole purpose of playing Where in the World is Carmen Sandiego?). AOL CD-ROMs weren’t yet ubiquitous, much less used as coasters, frisbees, and microwave sparklers. Given the vast technologico-cultural chasm between Stoll and us, it’s actually surprising how well he describes the Internet of 2013.

Here’s a bit of Stoll’s article that Yglesias doesn’t quote:

What the Internet hucksters won’t tell you is that the Internet is one big ocean of unedited data, without any pretense of completeness. Lacking editors, reviewers or critics, the Internet has become a wasteland of unfiltered data. You don’t know what to ignore and what’s worth reading. Logged onto the World Wide Web, I hunt for the date of the Battle of Trafalgar. Hundreds of files show up, and it takes 15 minutes to unravel them—one’s a biography written by an eighth grader, the second is a computer game that doesn’t work and the third is an image of a London monument. None answers my question, and my search is periodically interrupted by messages like, “Too many connections, try again later.”

Obnoxious tone aside, this complaint remains applicable to our post-Google cyborg existence. Even if you don’t believe the Internet is “a wasteland of unfiltered data,” you can imagine why someone might feel that way. And if this qualifies as a “prediction,” then Stoll is a digital prophet.

And rhetorically, he’s tame compared to the Internet’s early boosters.

As the title of Stoll’s article indicates, predictions about the Internet’s future were highly exaggerated. It was supposed to change and enhance all realms of human experience – a “nirvana” (very ’90s). Living room virtual reality was always just around the corner. The Internet would eliminate trips to the doctor, to school, to the library, to the post office (in the ’90s, we were very concerned with short-distance trips). The Internet would cause, or correct, the Y2K apocalypse. These predictions, like all predictions, were framed by their moment. Stoll’s article imagined the Internet solving (or failing to solve) ’90s problems.

Even in 2013, nobody can imagine reading a digital book…if their idea of a book is fixed in 1995.

But Internet technologies continued to develop over time, solving newly emerging problems, then responding to the problems created by the solutions. Speculation about the future persisted among the keyboard class, but few major developments were accurately predicted by anyone outside Cambridge, Silicon Valley, the U.S. military, or the major telecommunication companies.

Today, the predictions that get the most attention are those that dramatically underestimate digital technologies. In the embarrassingly late year of 1998, Paul Krugman famously predicted that by 2005, “the Internet’s impact on the economy [will have] been no greater than the fax machine’s.” As for social networking, he argued that “most people have nothing to say to each other!”

To be fair, Yglesias does give Stoll some credit:

…the Web hasn’t lived up to the full maximum capacity of its dreams. Relatively few people are full-time telecommuters, for example, and efforts to genuinely replace traditional teaching with online instruction have been disappointing so far. But already we’re at a point where computer networks have changed the way government works enough that the inability to execute a major IT procurement initiative correctly has been the dominant political story of the fall. The publishing and media industries have been completely transformed. Almost everyone who learns things uses the Web as a useful supplement to classroom instruction.

So basically, the publishing industry is the big trophy, the slain giant. Other industries and institutions are simply modified. The mechanics of corporate structure have changed, but the basic structure remains the same. As for education: politicians and digital engineers have promoted the Internet’s potential to dramatically reform education since the early 1990s. Yet widespread reform has yet to occur. (Not for lack of experiments, which have so far produced very mixed results. Even Sebastian Thrun admits that MOOCs aren’t working.) And the government is still the damned government, even if Obama uses websites and Chuck Grassley uses Twitter.

Does this mean Internet-based reforms will never occur? Of course not. But despite the radical change the Internet has wrought, Stoll’s pessimistic 1995 forecast appears prudent and conservative at worst, prescient at best.

Meanwhile, the lesson Yglesias draws from Stoll’s one or two premature and comically wrong predictions is troubling. “I think [Stoll’s article] serves as a useful antidote,” he writes, “to a certain genre of writing popular on the Internet today where people poke fun at excessive techno-hype from Silicon Valley types.” I’ve dabbled in that genre, but even if I wrote these posts while teaching MOOCs from my tax-free seastead, I’d still argue that Yglesias is drawing the wrong conclusions from Stoll’s piece. True, Stoll’s article is editorializing run amok (remember back when someone was an expert on everything because they worked for a newspaper?). His tone is grating. But he was responding to the ’90s version of what Yglesias calls “excessive techno-hype.” The hype is more dangerous than the predictions: it leads to bubbles and recessions, to booms and busts. If given the choice between joining the hype and poking fun, I’ll poke fun – even at the risk of making a few ridiculous predictions.


5 thoughts on “Remember the Internet?”

  1. “The [Usenet] more closely resembles citizens band radio, complete with handles, harassment, and anonymous threats.”

    And this is still true, but what’s amazing to witness is the organic emergence of social norms on the internet, both informally (people admonishing one another not to engage in particularly vicious anonymous debate) and formally (reputation devices built into social media and online markets).

    The really interesting thing about the early adoption of new technologies, and the cause of legitimate fear of them, is that when no one understands a platform or product and traditional social conventions don’t yet apply to mitigate their misuse, social defectors self-select into the new technological field (think Mortgage Backed Securities) in order to take advantage of the information asymmetry and screw people.

    That process ought not to scare us away from new technologies broadly, as the cost/benefit calculus of the history of technology is astoundingly clear: it delivers progress.


    1. I get a little nervous whenever the word “progress” pops up (even – especially – if it’s just “progress on my dissertation”), but I guess that’s what I’m talking about in this post, so here goes:

      As your comment suggests, there’s a lot of middle ground between the innovation-transformation boosters and the Luddites. And obviously, most people (including you and me, I assume) occupy that middle ground. I like the word “organic,” which you use. Whenever new technologies (be they digital, industrial, military, economic, governmental, whatever) affect society, I typically prefer the change to occur slowly and organically. Caution defuses the “legitimate fear[s]” you describe.

      As for the cost/benefit calculus of the history of technology: you’re right, the benefits are often astounding, astronomical (sometimes literally!) and they typically outweigh the costs. But the costs are frequently catastrophic. The technological innovations that have produced the greatest good for the greatest number of people also tend to have a pretty high body count.


      1. When I talk about technology I’m referring to it in the economic sense of total factor productivity (the new ideas that are left over in explaining economic growth after accounting for inputs of humans and machines). So I always have in mind a chart of long run economic growth when I think of the impact of “technology,” which is an exponential graph (not surprisingly analogous to the graphs I think that guy Kurzweil or whatever found in particular case studies).

        I’m not very moderate when it comes to the beneficence of technology and economic growth. I don’t deny that there are some losers in our march forward, nor that we ought to care about these people (I’ve after all been a giant loser most of my life). But a few displaced workers from technological unemployment or people with identities stolen from early internet commerce don’t, to me, look like a high body count.


      2. That makes sense. Obviously I’m using the term “technology” colloquially, which unhelpfully expands and limits its definition: it means everything but is hard to use precisely. By “new technology,” I mean “new stuff”: new ideas, new social structures, and (especially) new devices, new tools. I readily grant the profound positive impact of Internet commerce; whatever its negative impact, it’s worth it.

        But when I talk about body counts, I’m not just thinking about unemployed candlemakers (I’m sure you’re not, either). I’m also thinking about the role of technology in warfare – and of warfare as a powerful engine of technological development. If you look at something like nuclear fission: there’s no question – even factoring in Hiroshima, Nagasaki, and Chernobyl – that nuclear technology has been, on the whole, a very good thing. But already you have a body count…a relatively small one, yes, but small only relative to the destructive power of otherwise net-positive technologies used to kill soldiers and civilians.

        And that’s just warfare: take anything, like mid-20th century agricultural policies that have been, on the whole, positive but that, in the short term, produced deadly famines or troubling surpluses (this in addition to the destroyed livelihoods; I’m not even including that). You could walk through the whole of the Industrial Revolution, step by step, and see how poverty, slavery, exploitation, and various other modes of unnecessary death and suffering were direct and unavoidable byproducts of technologies which were nevertheless profoundly beneficial, which may have saved more lives than they cost. That’s my basic argument: new technologies frequently accrue a body count, but they’re still worth it…but they still accrue that nasty body count. And the technologies that have the greatest capacity for positive construction tend, historically, to have a great capacity for destruction.


  2. Another reason people seem to be afraid of technology adoption isn’t just that it will damage some people or invite temporary exploitation — but that the work ethic itself essentially dictates that working smarter rather than harder is *cheating.* When Betty Crocker came out with boxed cake mixes, they couldn’t get women to buy them, because women no longer felt like they were cooking. So they included an entirely frivolous egg in the instructions in order to let women feel the warm proud glow of putting in work for their families.

