Ten Years After the Future Began

Some Comments on Security

Integrity and confidentiality are supported quite well by today's Internet standards, but authentication and larger trust issues are not. With authentication, we step outside the circle where protocols and code can solve the problem; we need support from the larger world. That additional verification and legal backing is what certificate authorities offer. Credit card purchases over the Internet also depend on these real-world structures. And still more support must be put in place if we are to achieve end-to-end security on the Internet.

Consider this analogy: You are sitting on a public bench when a scruffy fellow pulls up in a van and asks if you'd like some stereo equipment at a low price. Some buyers would be educated enough to know how to open up the equipment and make sure it matched the quality claimed by the seller. The seller, in turn, could hold your cash up to the light to make sure it's not counterfeit. That's the real-world equivalent of integrity on the Internet.

But you're not likely to conclude the deal simply on the basis that the equipment and the cash are both what they appear to be. First, you will justifiably assume that the transaction is illegal because the goods are stolen. Second, you will probably want to pay with a credit card and get a warranty for the goods. For a number of reasons that deal with authentication and trust, you're well advised to say no to the fellow in the van.

In many ways, Internet transactions take place out of the back of a virtual van. We can change that by erecting a complicated verification and authorization system, or adapt to it by using a Web-of-Trust system, and dealing with sites recommended by friends or other authorities. In either case, real-world backing is critical. Furthermore, system designers must always try to protect the right to anonymity outside of transactions and to data privacy.

Quality of Service

RFC 1287 championed the use of the Internet for voice and video. It also recognized that major enhancements to routers would be required to render a packet-based network appropriate for these exciting applications.

For the most part, their recommendations remain admirably relevant ten years later.

The Internet still does not support viable QoS. Multiprotocol Label Switching (MPLS) and IPv6 options for QoS are intriguing but unproven. RSVP (Resource ReSerVation Protocol) is supposed to create low-latency channels, but even its supporters admit it is feasible only if the correspondents share a LAN or another carefully controlled network. (RFC 1287 anticipated this problem when it pointed to routing between autonomous domains as an area for research.) Recently, the original type-of-service (TOS) byte in the IP header has been revisited, and an IETF working group (Differentiated Services, or DiffServ) has defined an architecture and standards for using it to provide real service differentiation. Products are appearing, but the services remain to be deployed.
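Marking packets at the host is the easy part; routers along the whole path must then be configured to honor the marking, which is exactly where the inter-provider cooperation problem arises. As a small sketch (assuming a Unix-like host and Python), an application can set the old TOS byte with the standard IP_TOS socket option:

```python
import socket

# Sketch: marking a socket's traffic with a DiffServ code point.
# The IP_TOS option sets the former type-of-service byte in the
# IP header; DSCP 46 ("Expedited Forwarding") is the value
# commonly used for low-latency traffic such as voice.
EF_DSCP = 46

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# The DSCP occupies the upper six bits of the old TOS byte,
# so it is shifted left by two before being written.
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, EF_DSCP << 2)

tos = sock.getsockopt(socket.IPPROTO_IP, socket.IP_TOS)
print(tos)  # 184, i.e. 0xB8
sock.close()
```

Every datagram sent on the socket now carries the marking; whether any router treats it differently is a matter of deployment, not code.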

Steve Crocker writes, "QoS requires considerable inter-provider cooperation and standards. There's been a huge land grab and subsequent crash among the providers. We probably have to wait for the dust to settle before we can get any real cooperation." A lot of organizations that want to eliminate jitter and transmission breaks achieve those goals by over-provisioning the communication lines, or by running a lower-layer service with policy routing, such as ATM or MPLS.


Advanced Applications
RFC 1287 pleaded for advanced applications, and called for more formats to be recognized as standards. MIME has essentially fulfilled this goal, as mail standards designer Einar Stefferud points out. The easy integration of file types on the Web (through MIME or some simpler form of recognition like file suffixes) gave us the basis for rapid advances in graphics, audio, video, and animation that one can find either extremely impressive or highly annoying. The Internet2 community is continuing the quest for more high-bandwidth applications.

Thus, the Internet community has accomplished most of what RFC 1287 requested in the applications area. One intriguing area left to be tied up is the RFC's "Database Access" goal, which calls for "a standard means for accessing databases" and methods that "will tell you about its [the data's] structure and content." Several standards organizations have been developing solutions, often based on SQL. A system that is widely adopted and easy to use is probably not far off, in the form of the W3C's XML Query.
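Whatever standard eventually prevails, the shape of the RFC's "Database Access" goal is already visible in today's SQL interfaces: one uniform way to query, plus metadata that describes the data's structure. A minimal sketch, using Python's DB-API and an in-memory SQLite database purely as stand-ins:

```python
import sqlite3

# Sketch of uniform database access: one API for queries, plus
# metadata describing the data's structure. SQLite and Python's
# DB-API stand in for whatever standard interface wins out.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE rfc (number INTEGER, title TEXT)")
conn.execute(
    "INSERT INTO rfc VALUES (1287, 'Towards the Future Internet Architecture')"
)

cur = conn.execute("SELECT number, title FROM rfc")
# cursor.description tells you about the data's structure...
print([col[0] for col in cur.description])  # ['number', 'title']
# ...and fetching tells you about its content.
print(cur.fetchall())  # [(1287, 'Towards the Future Internet Architecture')]
conn.close()
```

The RFC's goal amounts to making this pattern work across databases you have never seen before, not just ones you created yourself.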

Some Comments on Quality of Service and Multimedia

I have no quarrel with the claim--spoken by planners ranging from the most technical communities to the chambers of corporate management--that voice and video represent the core communications channels for the masses, and that the Internet will flower in ways we cannot even imagine if it manages to support interactive voice and video robustly. Realizing the importance of that vision is much easier than realizing the vision itself.


I suspect that packet-switching is not the right delivery channel for large, real-time streams. Using a dedicated circuit to deliver a movie is much easier. Current systems for delivering entertainment are not broken, and there's no reason to fix them. In fact, physical media like CD-ROMs have a lot to recommend them; they're cheap, easy to transport, and durable. To see how responsive this distribution system is, consider Afghanistan, where music and films were banned for five years. Two days after the Taliban left Kabul, CDs and videos were being sold all over town.

But the Internet should evolve to support bursts of high-bandwidth material. We need to make it good enough for reasonably high-quality telephone calls, for interactive video games, and for showing simulations or brief film clips as part of such information-rich communications as distance learning or medical consultations. As for films and music--well, in the new Internet environment, I expect them to evolve into playful, malleable, interactive variations that compete with the traditional media, without trying to displace or ape them.

As audiovisual formats proliferate, Internet researchers recognize the need to tie them together and let them interoperate. Solutions seem to be collecting under the XML umbrella. Synchronized Multimedia Integration Language (SMIL), Scalable Vector Graphics (SVG), and perhaps even the Resource Description Framework (RDF) can be considered steps toward the integration and standardization of new media.
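The practical payoff of collecting these formats under XML is that generic XML tools can read all of them. As a small illustration (using Python's standard library, with a made-up one-circle image), an SVG document is just another XML tree:

```python
import xml.etree.ElementTree as ET

# Because SVG is plain XML, a generic XML parser can read it:
# no graphics-specific tooling is needed to inspect the document.
svg = """<svg xmlns="http://www.w3.org/2000/svg" width="100" height="100">
  <circle cx="50" cy="50" r="40" fill="blue"/>
</svg>"""

root = ET.fromstring(svg)
ns = "{http://www.w3.org/2000/svg}"
circle = root.find(ns + "circle")
print(circle.get("r"), circle.get("fill"))  # 40 blue
```

The same parser would handle a SMIL presentation or an RDF description; that shared toolchain is the integration the paragraph above describes.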

RFC 1287 Didn't Cover Everything

Having covered the major areas addressed in RFC 1287, I will end by stepping back and asking what its authors failed to anticipate, or which modern issues simply seemed irrelevant to their discussions. I find their visionary predictions and recommendations quite on target. But hindsight always has the last word.

Everyone knew that corporations would eventually discover the Internet, and that they would demand that it become a more reliable medium. What became obvious only later on is that the commercial interests' notion of reliability included not just up-time and bandwidth, but legal and political policies. These policy issues ranged from global ones like the relationship between domain names and trademarks, to narrow problems like the challenge that Internet gambling presented to traditional casinos.

The growth of commercial involvement paralleled the decline of government involvement. One turning point came with some fanfare in 1995 when the original, government-sponsored Internet backbone (NSFnet) was replaced by one run by the private entity Advanced Network Services (ANS). Significantly, ANS was bought by America Online in 1994.

Internet growth strained the organizations tasked with designing it, and led to the creation of some new organizations. The Internet Society, the World Wide Web Consortium, and ICANN are all attempts to solidify both funding and standards for Internet operation.

The Internet still retains an abstract purity that remains remarkably robust in an age when millions of people rely on it, but it rests on a nuts-and-bolts, real-world infrastructure that is far more problematic. The cry for higher bandwidth along the last mile has been heard by cable and telecommunications companies, but their offerings fall short along several measures (cost, availability, reliability, upstream bandwidth, and so on).

Worse still, after years of shrill accusations and admonitions, it's clear that high bandwidth is spreading slowly because it's genuinely hard to do well at a reasonable cost. The problems can't be blamed entirely on manipulative policies by malignant companies. But bold new thinking is required to find a low-cost solution, and it will probably have to come from an entirely new quarter. As one example, the successes of spread-spectrum, packet-radio Internet access offer a tantalizing promise for the future, and await recognition by governments that new spectrum should be reassigned to this efficient and open medium.

To sum up, the Internet's success outstripped even the predictions of its leaders in 1991. But its success rests to a large extent on the types of planning and vision that these leaders demonstrated in RFC 1287. Obviously, they couldn't anticipate everything. But what's more surprising is that a lot of what they called for has taken more than ten years to put in place. Apparently, the evolution of digital media does not always take place on Internet time.


Thanks to Fred Baker, Nathaniel S. Borenstein, Brian E. Carpenter, Lyman Chapin, Steve Crocker, Jon Crowcroft, Russ Hobby, Harry Hochheiser, John Klensin, Clifford Lynch, and Einar Stefferud for their comments and review. Responsibility for errors and opinions lies with the author.

Andy Oram is an editor for O'Reilly Media, specializing in Linux and free software books, and a member of Computer Professionals for Social Responsibility. His web site is
