2002: The Carpetbaggers Go Home
by Cory Doctorow
The Internet is antithetical to commerce. There, I said it.
Business is built around reliability, offering a predictable quality of service from transaction to transaction. Even the messiest, one-off businesses are based on reliability; for example, estate auctioneers are predictable -- indeed, they provide the only touchstone of predictability in one-off sales, through the authorship of dependably consistent auction catalogs.
The Internet is unpredictable. It's non-goddamned-deterministic. That's the point, of course. The Internet is a lingua franca (or, if you believe the poor bastards who went broke on Frame Relay networks, a lowest-common-denominator) by which just about any computer can talk to just about any other computer. Your computer only needs to know how to construct its message such that it conforms -- more or less -- to the latest rev of the holy book of Standards, slap an IP address on the ass-end of it, and ship it out to the intended recipient, who interprets it according to that same holy writ.
That's the thing about messages -- the recipients can do anything they want with them, not merely what you expect them to do. For example, Raffi emailed me a couple of weeks ago with the wonderful idea of identifying cataclysmic, 9-11-grade crises by monitoring the lagginess of CNN's Web server, on the very sensible grounds that when bad stuff happens, the whole world fires up its browser and points it at cnn.com, which gags and barfs when it is confronted with a half-billion twitchy netizens trying to get the scoop on the latest thrax scare.
Raffi doesn't care what's on CNN's Web page -- he cares about how long it takes CNN's httpd to respond to a generic query. Like a jazz critic who listens to the pauses more than the notes, Raffi will toss away the message sent by CNN's httpd and record only the interval associated with it.
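Raffi's crisis-meter is easy to sketch. Here's a minimal, hypothetical version in Python (the function names and the ten-second "gagging" cutoff are my own illustration, not Raffi's actual code): time the request, throw the message away, keep only the pause.

```python
import time
import urllib.request

def timed(fn, *args, **kwargs):
    """Run fn, discard whatever it returns, and report only the
    interval -- the pauses, not the notes."""
    start = time.monotonic()
    try:
        fn(*args, **kwargs)
    except Exception:
        pass  # a gagging, barfing server is still a data point
    return time.monotonic() - start

def cnn_lag(url="http://www.cnn.com/", timeout=10):
    # Hypothetical crisis detector: ignore what's on the page
    # entirely and watch only how long the httpd takes to answer.
    return timed(urllib.request.urlopen, url, timeout=timeout)
```

The point is that `timed()` is wrapped around a message whose contents it never reads; CNN's httpd has no say in how its replies get used.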
God, this kind of stuff drives most MBAs berserk. eBay's devoting fantastic heaps of engineering talent to make sure that their pages can't be scraped by homebrew Web-spiders that attempt to aggregate eBay's auction listings with Amazon's (and, of course, O'Reilly does the same thing with Safari). To both organizations' credit, neither has attempted to Java-fy their UI/UE as a means of controlling the sorts of things their users can do with their messages.
The Internet is loose and wobbly from the bottom up. TCP/IP is all about non-deterministic routing: Packet A and Packet A-prime may take completely different routes (over transports as varied as twisted pair, co-ax, fiber, sat, and RF) to reach the same destination. When the Internet is running tickety-boo, the traffic histogram at any point is positively Brownian, fuzzy and random and bunchy and uncoordinated as a swarm of ants randomwalking through your kitchen.
Fuzzy at the bottom: TCP/IP. Fuzzy in the middle: message-passing protocols. Fuzzy on top: services.
The Internet is full of fantastically useful and frustratingly unavailable services, from the elegant simplicity of Weblogs.com's XML-RPC interface that accepts a URL and a link-title and shoves 'em on top of the stack of recently updated sites, to the unaffiliated public CVS servers that pock the Internet like so much acne. They work well enough, on average, and if they were all to fail suddenly and at once, the Internet would kind of suck until they came back online. But there are enough of these little tools, enough ways of finding and manipulating information, that users can interpret unreliability as damage and route around it, finding alternate means of communicating and being communicated at.
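To see just how elegantly simple that Weblogs.com interface is, here's a sketch using Python's stock `xmlrpc.client` module. The method name `weblogUpdates.ping` and its two-string signature match the interface Weblogs.com published, but treat the details as illustrative; serializing the call locally shows the entire message without touching the network:

```python
import xmlrpc.client

# The whole service boils down to one method name and two strings,
# serialized as XML and shipped to an endpoint. Build the wire
# message locally to see everything there is to see.
payload = xmlrpc.client.dumps(
    ("Boing Boing", "http://boingboing.net/"),
    methodname="weblogUpdates.ping",
)
print(payload)

# Actually sending it would be one more line, e.g.:
#   xmlrpc.client.ServerProxy("http://rpc.weblogs.com/RPC2") \
#       .weblogUpdates.ping("Boing Boing", "http://boingboing.net/")
```

That's the whole protocol: fuzzy in the middle, and anyone with an XML library can speak it.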
This nondeterminacy is antithetical to all our traditional notions about success in branding and business. The CEOs of the world have a lot of trouble getting their heads around it:
The music industry wasn't always sure that Napster was their doom. When Napster debuted, bigwigs at conferences like DDMI pooh-poohed the service. No one who's serious about music will use Napster, it's too unreliable. Napster's users will surely quit in disgust when they discover that the availability of any given song is based on whether some other Napster user who has the song has left their computer on and Napster running. If that doesn't scare them off, then badly-ripped files with misspelled, inaccurate metadata and files truncated en route will.
And these were all real problems with Napster! Nothing's worse than downloading three quarters of a file and having the guy you're downloading from switch off his machine! But it didn't matter. It was good enough -- empirically so. Napster is the fastest-adopted technology in human history.
A sane, managed approach to building a reliable Napster would involve armies of trained employees, laboring around the clock to digitize, annotate, post, and catalog mountains of CDs. The files would live on a server-cluster the size of an ENIAC outbuilding with a braided rope of fiber as thick as a baby's arm punched up through the raised floor.
It would cost a fortune. By the time the industry was ready to sell you an MP3 for download, they'd have to bill you $22.95 for it.
The point is that Napster -- and its anarchic, unmanaged successors -- was almost reliable enough to build a business on, just one little last-mile away from brand-consistent perfection.
The Intractable Percentile
The first time many corporate silverbacks saw the Internet, they came to the conclusion that the way to commercialize the thing was to carve out managed pockets of sanity in the anarchy. Consumers have been bred to expect consistency, cultivated for it by generations of Madison Avenue Mafiosi, and changing the expectations of consumers back to unbranded chaos is not an option at this late date -- it'd be like trying to breed chihuahuas back into wolves. Instead, you change the environment to meet consumer expectation, sit back, and open the checks.
After all, it's just a little order imposed in the chaos, right? How hard can it be?
This is one of the classic fallacies that players of the Big Con like to confuse their marks with: if A is possible, and B is possible, then how hard can it be to do A and B together? Eric Benson soaked up a lot of people's money with a scam that exploited this. Benson had a prototype child-finder bracelet that used satellite-based positioning signals to track abducted, lost, and missing tots, then beamed their locations back to the satellite so they could be recovered.
Benson's plan sounded reasonable. He could show you a working, miniature GPS receiver that could pinpoint its location to the yard from anywhere on earth with a clear shot at the sky. Everyone knows that GPS sats communicate with GPS devices to accomplish this quotidian miracle. It's therefore reasonable to believe that such a device can talk back to the satellite and report its position.
Of course, this is a lie. Satellite receivers can be small and low-powered -- the satellites are sending their signals with enough power to be heard by anemic little dashboard navigation systems. Uplinking to a satellite, on the other hand, requires a device with all the power-consumption, safety, and user-friendliness of an industrial microwave oven without a door. Benson's child-finder bracelets couldn't work the way he said they would -- at best, they'd allow lost children to know exactly how lost they were.
The people that Benson rooked were making the same kinds of semi-reasonable assumptions that the Mighty Morphin Power Brokers of the New Economy made when they assumed, for example, that a crappy audio stream could be made radio-grade with just a little engineering know-how and some strategic alliances.
After all, if something's 95 percent of the way to perfection, how hard can it be to nail that last five percent?
Engineers understand that the difference between 95 percent and 96 percent reliability is often infinite. Some problems are tractable within a certain tolerance, and asymptotic to infinity above that tolerance. The Internet is full of best-effort algorithms, timeout mechanisms, asynchronous communication, and stateless clients. This is the fault-tolerant realpolitik of a public network composed of uncoordinated actors with conflicting agendas.
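That fault-tolerant realpolitik has a recognizable shape in code. Here's a generic best-effort sketch (the function and its parameters are my own illustration, not any particular protocol's): try the flaky thing a few times, back off between attempts, and then give up gracefully rather than demanding the intractable percentile.

```python
import time

def best_effort(fn, attempts=3, backoff=0.1):
    """Try a flaky operation a few times with exponential backoff,
    then give up gracefully -- route around the damage instead of
    demanding 100 percent reliability."""
    for i in range(attempts):
        try:
            return fn()
        except Exception:
            time.sleep(backoff * (2 ** i))  # back off, then retry
    return None  # good enough, on average
```

A caller that gets `None` back finds another way to communicate; nobody holds the connection open waiting for perfection.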
You can't be a control-freak on a public network. KPMG learned this lesson the hard way. The big-name Internet carpetbagger consultants have a policy that requires anyone wishing to link to their site to secure a formal, legally binding agreement in advance -- just as anyone using KPMG's trademarks would have to negotiate that right first. When KPMG sent a threatening note to Chris Raettig, an itinerant British engineer, demanding that he produce evidence of his formal agreement granting him permission to link to their site, Raettig countered by sending back a snippy note and posting both to his Web site. Within days, KPMG had become a laughingstock, their brand ("KPMG is the global network of professional advisory firms whose aim is to turn knowledge into value for the benefit of its clients, its people and communities.") flushed down the toilet.