The Web as a Network Services Architecture
There's another implication in what Brian and Roman just showed you that I want to bring to your attention. Like many other applications in the age of the Internet, open source project management is turning into a hosted application.
"Don't think of the Web as a client-server system that simply delivers web pages to web servers. Think of it as a distributed services architecture, with the URL as a first generation "API" for calling those services."
- John Udell
This is where Lessig's point that we need to think about changes in architecture really starts to hit home.
Think for a minute about the most useful new computer applications of the last few years. Few of them are applications that you install on your local PC. Instead, they are delivered through the window of your browser: e-commerce applications like Amazon.com, eBay, or E*Trade, or useful information applications like MapQuest.
There are enormous implications in this shift for the open source and free software community. For example, I've tried to get Richard Stallman to realize that the GPL loses its teeth in a world where developers no longer need to distribute software in order for users to make use of it. A hosted web application could be built entirely with GPL'd software, and yet have no requirement that the source code be released, since the application itself is never distributed, and distribution is what triggers the GPL's source code availability clause.
But I don't want to spend time here on the implications for licensing. Instead, I want to talk about the implications for that marvelous aspect of the fundamental UNIX design: the pipe, and its ability to connect small independent programs so that they could collectively perform functions beyond the capability of any of them alone.
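To make the pipe concrete before asking what its web equivalent is, here is a minimal sketch of the idea. The input words are invented for illustration; the point is that each small program does one job, and `|` composes them into something none of them could do alone:

```shell
# Find the most frequent word in a stream of text by composing
# four small, independent programs with pipes.
printf 'web\npipe\nweb\nunix\nweb\n' |
  sort |      # group identical words together
  uniq -c |   # collapse each run into a count
  sort -rn |  # order by count, descending
  head -n 1   # keep only the most frequent word
```

This prints `3 web` (with leading spaces from `uniq -c`): none of these tools knows about word frequency, yet together they compute it.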
What is the equivalent of the pipe in the age of the web?
For an answer, I'm going to quote from Jon Udell, the former Byte magazine editor and author of the book Practical Internet Groupware. Jon is one of the most prescient technology observers I know. Several years ago, he turned me on to a concept that has more implications than I can count. This is one of the REALLY BIG IDEAS that is going to shape the next five or ten years of computing.
Jon starts with a simple premise. Don't think of the web as a client-server system that simply delivers web pages to web browsers. Think of it as a distributed services architecture, with the URL as a first generation "API" for calling those services.
As Jon said in his keynote for the Zope track at the recent Python conference:
"To a remarkable degree, today's Web already is a vast collection of network services. So far, these services are mainly browser-oriented. My browser "calls" a service on Yahoo to receive a page of a directory. Or it "calls" a service on AltaVista to receive a page of search results.
Jon goes on to describe one such program he wrote, a program he calls the web mindshare calculator. He knows that AltaVista provides a links keyword that returns the number of sites linking to a given URL. He knows that Yahoo provides a categorization of related sites. So he wrote a simple Perl program that, given a starting point in the Yahoo tree, traverses the tree to its bottom, and feeds the resulting list of URLs, one by one, to AltaVista with the links keyword. The output is a list, in descending order, of the most linked-to sites in any given Yahoo category.
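The shape of that program is worth sketching, in Python rather than Jon's Perl. Since the Yahoo directory and AltaVista's links query are being stood in for here, both "services" are stubbed with invented data; the real program fetched them over HTTP:

```python
# Sketch of the "web mindshare calculator": traverse a category tree to
# collect site URLs, ask a link-count service about each one, and rank
# the sites by inbound links. Both services are stubs with invented data.

# Stub for Yahoo's directory: each category has subcategories and sites.
CATEGORY_TREE = {
    "Computers": {
        "subcategories": ["Languages"],
        "sites": ["http://example.org/unix"],
    },
    "Languages": {
        "subcategories": [],
        "sites": ["http://example.org/perl", "http://example.org/python"],
    },
}

# Stub for AltaVista's links query: how many pages link to each URL.
LINK_COUNTS = {
    "http://example.org/unix": 120,
    "http://example.org/perl": 300,
    "http://example.org/python": 210,
}

def collect_sites(category: str) -> list[str]:
    """Walk the category tree from a starting point, gathering site URLs."""
    node = CATEGORY_TREE[category]
    sites = list(node["sites"])
    for sub in node["subcategories"]:
        sites.extend(collect_sites(sub))
    return sites

def mindshare(category: str) -> list[tuple[str, int]]:
    """Feed each collected URL to the link-count 'service'; sort descending."""
    scored = [(url, LINK_COUNTS.get(url, 0)) for url in collect_sites(category)]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

for url, count in mindshare("Computers"):
    print(count, url)
```

The structure is exactly a pipeline: the output of one service (a list of URLs from the directory) becomes the input of another (a link count per URL), with a small program as the connecting pipe.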
As he says, this is the Web's analog to the UNIX pipeline. It allows you to use two websites in a way that their creators didn't quite intend, but which extends them and makes them more useful.