Published on MacDevCenter (http://www.macdevcenter.com/)


Watching the "Alpha Geeks": OS X and the Next Big Thing

by Tim O'Reilly
05/16/2002

Lunchtime Keynote at the Apple Worldwide Developer Conference, May 8, 2002

Author's Note: The following is not a literal transcript. I speak extempore and typically wander from my script quite a bit. But this is what I wrote up ahead of time as the general drift of what I planned to talk about. Sometimes it's written out fairly completely. In other places, usually where I've addressed the material elsewhere, I've just written brief notes to myself.

Look at Inventing the Future for expansions of some of this material. Parts of this talk were based on that one. And the final section of this talk is a much abbreviated version of the talk on the architecture of Unix and its implications for Web Services that I gave at JavaOne three years ago. I've added some bracketed notes where I know I diverged quite a bit from the script.

This talk was advertised as Tim O'Reilly on OS X. I'm not really going to talk about OS X very directly. I'm going to talk about three things:

1. How you can see the shape of emerging technologies by watching hackers and other "alpha geeks."

This is how we get most of our good ideas at O'Reilly. We look for people who appear to be doing magic, and ask them how they do it. (Remember Arthur C. Clarke's dictum: "Any sufficiently advanced technology is indistinguishable from magic.") There are always people in any field who are the most clued in to the deep trends, who seem to be playing with all the coolest stuff, and seem to have their finger in everything before most people even know about it. We get these people to tell us what they do, and we persuade them to write it down, or tell it to someone else who can write it down.

This is how we figure out what books to publish. And it's also why we called our next conference in Santa Clara the Emerging Technology Conference. This year we're focusing on what I'm calling the emergent Internet operating system, but next year, the big news from the alpha geeks may be something else.

(As to why it's important for developers to think about deep trends and about where things are going, I heard a great quote from Ray Kurzweil at a nanotechnology conference a couple of weeks ago -- "I'm an inventor, and that's what made me interested in trend analysis: Inventions need to make sense in the world where you finish a project, not the world in which you start the project.")

2. Lessons from the future.

What is it that I'm actually seeing by watching these guys? I'm going to talk about the big trends that are coming down the pike, and why I think Mac OS X is riding the wave just right.

3. Lessons from the past.

Mitch Kapor, founder of Lotus Development and co-founder of the Electronic Frontier Foundation, once said, "Architecture is politics." Some system architectures are more "hacker friendly" (and thus innovation-friendly) than others. I'm going to talk about some of the characteristics of these architectures, and the lessons you can take from them for your own development.

Watching the Alpha Geeks

If you look at how new technologies come into play, you typically see this sequence:

1. Someone introduces a fundamental breakthrough, a disruptive technology or business model that will change the nature of the game.

Aside: The term disruptive technology comes from Clayton Christensen's book, The Innovator's Dilemma. He cites two types of innovations: sustaining technologies (cheaper, faster, better versions of existing technologies) and disruptive technologies.

Disruptive technologies are often not "better" when they start out -- in fact, they are often worse. Case in point: the PC. It wasn't better than the mainframe or minicomputer. It was a toy. Similarly, the WWW was far less capable than proprietary CD-ROM hypertext systems, and far less capable than desktop apps. And developers of both derided it as slow, ungainly, and ineffective. This is a typical response to disruptive technologies. Eric Raymond, speaking of open source, quoted Gandhi: "First they ignore you, then they laugh at you, then they fight you, then you win."

Disruptive technologies often lead to a paradigm shift. (I know the phrase "paradigm shift" gets overused. It's a little bit like the knights who say "Ni" in Monty Python and the Holy Grail. I'm going to say "paradigm shift" and it will freeze you in your tracks! Paradigm shift. Paradigm shift.) [In the actual talk, I did an extended aside here on Kuhn's Structure of Scientific Revolutions, and the origin of the concept of the paradigm shift.]

But it's true. The full effect of a disruptive technology paradigm shift often takes decades to be felt. There were two paradigm shifts at work in the PC revolution: first, taking the computer out of the glass house and giving it to ordinary people; and second, basing computers on commodity hardware and industry-standard designs.


There are disruptive business models as well as disruptive technologies. IBM's decision to "open source" their PC design and let other manufacturers copy it was critical to the growth of the market. It's why the Intel-based PC, and not the superior Apple Macintosh, is the dominant hardware platform today.

Often, disruptive technologies "live underground" for a long time before they're ripe for the paradigm shift to occur. For example, the basic concepts of open source have been around for many years, but they didn't become mainstream until wide-area computer networking (which to my mind is a key ingredient of the "secret sauce" behind open source) became widespread.

OK. So we have a disruptive innovation. What happens next?

2. Hackers and "alpha geeks" push the envelope, start to use the new technology, and get more out of their systems long before ordinary users even know what's possible.

Both the Internet and open source were part of a hacker subculture for many years. I got my first email address back in 1978, when the ArpaNet was exactly the kind of "magic" I was talking about earlier. Some people had it. Others didn't. (And in fact, the origins of sendmail, the mail server that still routes the majority of Internet email, were based on exactly this disparity in skills and access. When he was a researcher at UCB, Eric Allman had ArpaNet access, and everyone wanted an account on his machine. He decided it was easier to route mail from the campus network onto the ArpaNet than to manage 700 accounts.)

Here's a good example that's still a bit far out, but that I'm confident is significant. I held a summit of peer-to-peer networking developers, and when we were sitting around having a beer afterwards, a young FreeNet developer said to Kevin Lenzo (who was there because of his early work on IRC infobots): "You sound familiar."

Kevin mentioned that he was the developer of festvox, an open source speech synthesis package, and that he was the source of one of the voices distributed with the package. "Oh, that's why. I listen to you all the time. I pipe IRC to festival so I can listen to it in the background when I'm coding."

Now I'll guarantee that lots of people will routinely be converting text to speech in a few years, and I know it because the hackers are already doing it. It's been possible for a long time, but now it's ripening toward the mainstream.
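
To make that concrete: here's a minimal sketch, in Python, of piping chat text through a speech synthesizer. It's not the exact setup he described; it assumes the festival binary is installed and accepts text on standard input via its --tts flag, and the log file name in the usage comment is made up.

    #!/usr/bin/env python
    # Rough sketch of "pipe IRC to festival": read chat lines on stdin and
    # speak each one. Assumes the festival binary is on the PATH and reads
    # text from stdin when given the --tts flag.
    #
    # Hypothetical usage: tail -f irc-channel.log | python speak_irc.py

    import subprocess
    import sys

    def speak(text):
        """Hand one line of text to festival for synthesis."""
        subprocess.run(["festival", "--tts"], input=text.encode("utf-8"))

    if __name__ == "__main__":
        for line in sys.stdin:
            line = line.strip()
            if line:                      # skip blank lines
                speak(line)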

3. Entrepreneurs create products that simplify what the hackers came up with; there's lots of competition around features, business model, and architecture.

A good example: On the Web, CGI was originally a hack. Then we saw a lot of different systems to improve on the CGI model, and make database-driven Web sites easier for everyone: Cold Fusion, ASP, PHP, JSP.
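
For the curious, here's roughly what the original hack looked like. A CGI script is just a program the Web server runs once per request: it reads the query from its environment and writes headers plus HTML to standard output. This is a minimal sketch in Python rather than the Perl of the era, and the "name" parameter is invented.

    #!/usr/bin/env python
    # Minimal CGI sketch: the server invokes this script per request, hands it
    # the query string via the environment, and whatever it prints to stdout
    # (headers, blank line, body) goes back to the browser.

    import html
    import os
    from urllib.parse import parse_qs

    query = parse_qs(os.environ.get("QUERY_STRING", ""))
    name = query.get("name", ["world"])[0]        # hypothetical parameter

    print("Content-Type: text/html")              # headers first...
    print()                                       # ...then a blank line...
    print("<html><body><h1>Hello, %s!</h1></body></html>" % html.escape(name))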

4. Things get standardized, either by agreement or by someone winning dominant market share.


Systems get easier for ordinary people to use, but less satisfying for advanced users. During the standardization process, dominant players put up barriers to entry and try to control the market. Entrepreneurs get acquired or squeezed out. Hackers move on to new areas, looking for "elbow room." Innovation slows down. The cycle repeats itself.

The best platforms know how to find a balance between control and hackability, and the best companies learn how to disrupt themselves before someone else does it to them.

Microsoft gets a lot of heat for not leaving enough on the table for others. My mother, who's English, and quite a character, once said of Bill Gates, "He sounds like someone who would come to your house for dinner and say, 'Thank you. I think I'll have all the mashed potatoes.'"

This isn't quite fair, but it gets the point across, at least about some of Microsoft's behavior. I do think that Microsoft is starting to learn something of the lesson that IBM learned many years ago: how to live with dominant market share without killing off all the outside innovation. I do see signs that they are trying to play better with other people, for example, in the work around SOAP.

Lessons from the Future -- What Are the Alpha Geeks Telling Us Right Now?

1. They are choosing Mac OS X in overwhelming numbers!

There have been an amazing number of iBooks at recent O'Reilly conferences. The adoption by key OSS communities and leaders is also striking. For example: most of the Perl core team is now on OS X; James Gosling, Duncan Davidson, and a lot of other key Java developers; P2P developers; many of the key developers in bioinformatics (a very important new field involving the application of computer power to gene research and related areas). All are heavily into OS X.

And of course, it's not just hackers, but users who are taking up OS X in droves. Mac OS X: The Missing Manual is our fastest-selling new book since The Whole Internet User's Guide in 1992-1993, when the commercial Internet started to take off.

Why?

And there are a lot of things even more appealing about Jaguar (Mac OS X 10.2). I'll talk about those a bit more in a moment, because the fact that OS X is so in touch with what's coming down the pike is what's making it the platform of choice for today's hacker.

What are the hackers telling us beyond the fact that they like OS X?

2. Assume network connectivity is central; don't treat it as an add-on. Also assume cheap local storage.

Discuss MP3.com vs. Napster [See "Inventing the Future" for a description of what I talked about here.]

Bob Morris, vice president at IBM Almaden and keynoter at next week's conference, pointed out that storage is getting cheaper even faster than processing power. Ad hoc peer-to-peer networking is a key part of the new paradigm: assume people are on, assume redundant resources come and go, and find ways to navigate that paradigm. It's really exciting to see that Apple totally gets this, with Rendezvous and the new features in iTunes under Jaguar. It's also impressive that Steve Jobs is willing to stand up to the copyright bullies and argue for the disruptive paradigm.
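
To give a flavor of what that looks like in code, here's a rough sketch of Rendezvous-style (multicast DNS) service discovery. It uses the third-party Python zeroconf package as a stand-in rather than any Apple API, and the service type, name, address, and port are all invented.

    # Sketch of Rendezvous-style (multicast DNS) discovery using the
    # third-party "zeroconf" Python package (a stand-in, not an Apple API).
    # The service type, name, address, and port below are invented.

    import socket
    import time
    from zeroconf import ServiceBrowser, ServiceInfo, ServiceListener, Zeroconf

    class MusicListener(ServiceListener):
        """React as peers appear and disappear on the local network."""
        def add_service(self, zc, type_, name):
            print("found:", name)
        def remove_service(self, zc, type_, name):
            print("gone:", name)
        def update_service(self, zc, type_, name):
            pass

    zc = Zeroconf()

    # Advertise our own (hypothetical) music-sharing service...
    info = ServiceInfo(
        "_example-music._tcp.local.",
        "My Shared Tunes._example-music._tcp.local.",
        addresses=[socket.inet_aton("192.168.1.10")],
        port=9999,
    )
    zc.register_service(info)

    # ...and browse for everyone else's. Resources come and go; the
    # listener simply reacts as they do.
    browser = ServiceBrowser(zc, "_example-music._tcp.local.", listener=MusicListener())
    try:
        time.sleep(60)
    finally:
        zc.unregister_service(info)
        zc.close()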

3. 802.11b (WiFi) networking is a must-have, not a nice-to-have.

Some of my favorite hacker exploits are in the 802.11 space right now. Hackers are wiring their own communities. They are extending the range of 802.11b networks with homemade antennas, made from coffee cans. How many of you saw the antenna shootout reported on oreillynet.com?

(Aside: Don't you hate the name WiFi? I prefer Rob Flickenger's nocat. Rob is the author of our book Building Wireless Community Networks and the co-founder of the community networking group at www.nocat.net.) You might think the term comes from "no CAT5 cabling" but in fact, it comes from a great story about Einstein.

When asked to describe radio, he said: "You see, wire telegraph is a kind of a very, very long cat. You pull his tail in New York and his head is meowing in Los Angeles. Do you understand this? And radio operates exactly the same way: you send signals here, they receive them there. The only difference is that there is no cat."

This may be an apocryphal story, but if it isn't true, it ought to be.

And of course, Apple is already all over this. 802.11 is a key part of why hackers like Mac laptops. The fact that it's hidden inside the case just makes it a natural part of today's laptop, rather than an add-on. And features like the integration of wireless with Rendezvous and iTunes are great.

4. Chat matters.

Long derided as a toy, chat is becoming one of the key new apps -- and what's more, key platform functionality. It's not just chat between people, but chat between programs. As described in Programming Jabber, DJ Adams has been using Jabber, the XML-based open source IM framework, to control SAP from a cell phone, using Jabber IM as the transport. Other folks are writing IM bots that answer questions and interact with users.
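
This isn't DJ Adams' SAP bridge, but here's a bare-bones sketch of the "chat between programs" idea: an IM bot that answers messages. It assumes the third-party slixmpp library (a Python Jabber/XMPP client), and the account details are placeholders.

    # Bare-bones Jabber/XMPP bot in the "chat between programs" spirit, using
    # the third-party slixmpp library. The JID and password are placeholders.

    import slixmpp

    class AnswerBot(slixmpp.ClientXMPP):
        def __init__(self, jid, password):
            super().__init__(jid, password)
            self.add_event_handler("session_start", self.on_start)
            self.add_event_handler("message", self.on_message)

        async def on_start(self, event):
            self.send_presence()              # tell the server we're online
            await self.get_roster()

        def on_message(self, msg):
            if msg["type"] in ("chat", "normal"):
                # A real bot would parse the request and query a backend here.
                msg.reply("You asked: %s" % msg["body"]).send()

    if __name__ == "__main__":
        bot = AnswerBot("bot@example.org", "secret")
        bot.connect()
        bot.process(forever=True)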

It's very impressive that Apple was able to make a deal with AOL to make iChat AIM-compatible. AOL has been missing a major opportunity associated with their IM software. They are making the same mistake Lotus made with spreadsheets. Their IM applications have dominant market share, but they are vulnerable because they are only applications. As Microsoft has demonstrated again and again, it's possible to integrate applications into a broader platform by offering aspects of those applications to developers as program-callable functionality in a set of system-level APIs. (In fact, that's the lesson of all these technologies -- make them available for developers to build on, not just as self-contained apps.)

5. Web Services are already a reality.

Hackers are building unauthorized interfaces via Web spidering and screen scraping. They think of Web-facing databases as software components to be recombined in new ways, and with new interfaces. We've built a whole custom set of market research interfaces to Amazon by spidering every computer book on Amazon every three hours, collecting vital statistics like author, publisher, page count, price, sales rank, and number and nature of customer reviews.

Another developer I know built a carpooling app based on MapQuest. An employee can say, "Who lives near me or on my route to work?" and get back a list of other employees on that route. These are brute force hacks that show the kinds of services that could be done much more efficiently if the Web sites in question provided suitable programmer interfaces via SOAP or XML-RPC.
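
Here's roughly what such a brute-force hack looks like: fetch a catalog page and pull a few fields out of the HTML with regular expressions. The URL and the patterns are invented, and real markup changes constantly, which is exactly the argument for a published interface.

    # Brute-force "unauthorized interface": fetch a page and scrape fields out
    # of its HTML with regular expressions. The URL and patterns are invented;
    # real markup changes constantly, which is why a published API beats this.

    import re
    from urllib.request import urlopen

    URL = "http://bookstore.example.com/detail/0000000000"   # hypothetical page

    page = urlopen(URL).read().decode("utf-8", errors="replace")

    matches = {
        "title":      re.search(r"<h1[^>]*>(.*?)</h1>", page, re.S),
        "price":      re.search(r"\$(\d+\.\d{2})", page),
        "sales_rank": re.search(r"Sales Rank:\s*#?([\d,]+)", page),
    }

    # Keep only the fields we actually found.
    print({k: m.group(1).strip() for k, m in matches.items() if m})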

[Look at Inventing the Future for a more detailed write-up of this point. I didn't write it up for myself here because I've already given this talk in detail at other times.]

These four things -- ad-hoc peer-to-peer networking, or Rendezvous (as Apple now calls it); nocat wireless networking; chat as transport; and Web sites as software components, or Web Services -- all come together (along with a couple of other things, such as grid computing) into what I call the "emergent Internet operating system." (That's the subject of my keynote at our conference in Santa Clara: The Shape of Things to Come.) We really are building a next generation operating system for the Net. The question is what kind of operating system we want it to be. I'm doing this bit of shameless self-promotion because it's really important that we think about this question.

I said earlier that after the hackers push the envelope, entrepreneurs start to simplify what they came up with. Hackers have been doing the kind of things that we see in OS X for a while. But they've had to do them the hard way. Apple has started to make it easy, so ordinary people can enjoy the benefits.

A good example is Apple's integration of Yahoo Finance and MapQuest right into Sherlock, rather than just through the browser. Perl hackers have been able to do this kind of integration for years, but ordinary people couldn't enjoy it. One of the challenges, though, is not just to integrate these things into an application other than the browser, but to expose APIs so that all developers can work with them. Google's API is a good step in the right direction, but all data-driven Web sites need to start thinking of themselves as program-callable software components for developers.
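
For contrast with the scraping hack above, here's roughly what "program-callable" means in practice: a data-driven site can expose a method over XML-RPC with nothing but Python's standard library. The method name and the record it serves are invented.

    # What a "program-callable software component" looks like: the site exposes
    # a method over XML-RPC instead of making everyone scrape its pages. The
    # method name and the data below are invented.

    from xmlrpc.server import SimpleXMLRPCServer

    BOOKS = {  # stand-in for the site's real database
        "0000000000": {"title": "An Example Book", "price": 19.95},
    }

    def lookup_book(isbn):
        """Return the stored record for an ISBN, or an empty dict."""
        return BOOKS.get(isbn, {})

    server = SimpleXMLRPCServer(("localhost", 8080), allow_none=True)
    server.register_function(lookup_book)
    server.serve_forever()

    # A client anywhere on the Net then calls it like an ordinary function:
    #   from xmlrpc.client import ServerProxy
    #   print(ServerProxy("http://localhost:8080/").lookup_book("0000000000"))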

Marc Andreessen once dismissively referred to Windows as "just a bag of drivers," but in fact, that was its genius. The Win32 API took some of the pain out of supporting devices, so that application developers didn't have to write lots of one-off interfaces. And one of the challenges of the "Internet OS" is to figure out the "bag of drivers" we need now. What are the services that we want to integrate and that developers want to build on? I believe that Web databases are part of what we need standard "drivers" for.

That's ultimately what Microsoft's .Net is about: defining a standard set of Net-facing programming services. And Apple is showing that they have a lot of the same moxie, except that they don't have the clout to believe they can own the whole thing, so instead they are trying to interoperate with the functionality that's already out there. That's one of the most exciting things I've heard about Jaguar: the integration of services from AOL (AIM and MapQuest) and Yahoo, which is absolutely the right thing to do.

What's particularly interesting here is also the way that a non-controlling player has to do "integration into the OS." Microsoft has typically integrated functionality that replaces or competes with the work of some existing player (Netscape, Real Networks, AOL), while Apple is having to find ways to integrate support for the existing players.

And of course, if the OS is an Internet OS rather than a PC OS, then the PC OSs are themselves software components of that larger OS. Apple seems to understand that.


So that brings me to the third and final part of my talk. If we're building an Internet OS, what kind of OS would we like it to be?

Lessons from the Past

That's where we can learn a lot from the design of Unix and the Internet. When I'm talking about design, I'm talking at a very high level, the "architecture is politics" level. I've recently become very aware of this level since reading Larry Lessig's book, Code and Other Laws of Cyberspace.

At some level, both Unix and the Internet are loosely coupled protocol-centric systems rather than API-centric systems.

I've always loved the formulation given in Kernighan and Pike's book about Unix, The Unix Programming Environment. If you're starting to work with Unix, you should go buy it, even though it was published by Addison-Wesley! It describes the few simple rules that made Unix work.

Everyone talks ASCII as a common data format. Write standard output and read standard input. Write small programs that work with other programs in a pipeline. I call this an architecture of participation. It made it possible for individuals all over the world to write programs without any central coordination. More than licensing, I think that this architecture is behind the success of Unix, and ultimately Linux.
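
Those rules still work verbatim. As a small sketch, in Python rather than the C and shell of the book, here's a filter in the classic mold: read standard input, write standard output, do one small thing, and let the pipeline do the composing.

    # A filter in the classic Unix mold: read stdin, write stdout, do one small
    # thing, and let the pipeline compose it with other programs, e.g.:
    #
    #   cat report.txt | python wordfreq.py | sort -rn | head

    import sys
    from collections import Counter

    counts = Counter()
    for line in sys.stdin:
        counts.update(line.lower().split())

    for word, n in counts.items():
        print(n, word)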

Similarly, the Internet was based on what Jon Postel, in the RFC for TCP, called "the robustness principle": be liberal in what you accept, and conservative in what you send. In other words, try to be interoperable. Contrast this with what happened when Netscape and Microsoft started fighting over control of HTML and the browser. We saw the exact opposite, where each was being extremely illiberal in what they accepted, and putting out all kinds of proprietary stuff that no one else knew how to render.
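
A much humbler illustration of the principle than TCP, but it shows the shape of the idea: accept input in several sloppy formats, emit exactly one canonical form. The list of accepted formats here is arbitrary.

    # A small illustration of the robustness principle: be liberal in the date
    # formats you accept, conservative (one canonical ISO form) in what you
    # emit. The accepted formats are arbitrary.

    from datetime import datetime

    ACCEPTED = ["%Y-%m-%d", "%m/%d/%Y", "%d %b %Y", "%B %d, %Y"]

    def normalize_date(text):
        for fmt in ACCEPTED:                          # liberal on input...
            try:
                return datetime.strptime(text.strip(), fmt).strftime("%Y-%m-%d")
            except ValueError:
                continue
        raise ValueError("unrecognized date: %r" % text)

    print(normalize_date("May 8, 2002"))              # ...conservative on output
    print(normalize_date("05/08/2002"))               # both print 2002-05-08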

The Internet architecture has flourished for 25+ years because of the robustness principle, and because it has a loosely coupled architecture in which the intelligence is in the endpoints, not in some central registry. Loosely connected actors, low barriers to entry ... these matter.

The result was explained really well in a recent message that Mike O'Dell, the former CTO of UUNET, the first commercial Internet service provider, sent to Dave Farber's IP list, regarding the spread of community wireless networks:

Biology will out-innovate the centralized planning "Department of Sanctified New Ideas" approach every time if given half a chance. Once the law of large numbers kicks in, even in the presence of Sturgeon's Law ("90 percent of *everything* is crap"), enough semi-good things happen that the progeny are at least interesting and some are quite vigorous.

"Nature works by conducting a zillion experiments in parallel and *most* of them die, but enough survive to keep the game interesting." (The "most of them die" part was, alas, overlooked by rabid investors in the latter '90s).

This is how the commercial Internet took off and this remains the central value of that network.

You don't need permission to innovate.

You get an idea and you can try it quickly. If it fails, fine -- one more bad idea to not reinvent later. But if it works, it takes off and can spread like wildfire. Instant messaging is a good example, even though ultimately complicated by ego and hubris. Various "peer-to-peer" things is an even better one.

WiFi is evolving the same way. A bit of *great* enabling technology made possible by a fortuitous policy accident in years long past, a few remarkable hacks, a huge perceived-value proposition (and I don't mean "free beer"), and presto! You have a real party going on in the Petri dish.

Clarifying note: During my tenure at UUNET, I described the real business as operating a giant Petri dish -- we kept it warm, we pumped in nutrients, and we made it bigger when it filled up. And people paid us money to sit in the dish and see what happened.

And standards compliance is a key part of the magic. I sometimes think that the IETF (the Internet Engineering Task Force, the Internet standards body responsible for such standards as TCP/IP) is one of the crown jewels in all of western civilization. For all its flaws, it's demonstrated that we can develop effective standards bottom up without a lot of central control. Again, it's made possible by a few simple rules: you can't have a paper standard, you have to have implementations, and since it's about interoperability, you have to have two independent implementations. This is summed up in Dave Clark's famous line: "We reject kings, presidents, and voting. We believe in rough consensus and running code."

The other thing is that "small is beautiful". Modular designs. Apache and Jakarta, Perl and CPAN. Give people a way to extend the base program, don't just build in all the functionality and close the case.
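
Architecturally, "a way to extend the base program" can be as small as one registration hook that the core calls and plugins fill in. Here's a toy sketch; every name in it is invented.

    # Toy sketch of an extensible core: the base program defines one small hook
    # (register) and loads extension modules at runtime, instead of building
    # every feature in and closing the case. All names are invented.

    import importlib

    HANDLERS = {}

    def register(name, func):
        """The only API an extension needs: attach a handler to a command name."""
        HANDLERS[name] = func

    def load_plugins(module_names):
        for mod_name in module_names:
            mod = importlib.import_module(mod_name)
            mod.setup(register)           # each plugin exposes setup(register)

    def run(command, *args):
        return HANDLERS[command](*args)

    # A plugin shipped separately (say, myplugin.py) would look like:
    #
    #   def setup(register):
    #       register("greet", lambda name: "hello, %s" % name)
    #
    # and the core would call load_plugins(["myplugin"]), then run("greet", "tim").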

And of course, both Unix and the Internet have adopted a kind of modular documentation. Pay as you go, write a man page or an RFC. Don't wait for the massive manual or the O'Reilly book.

Of course, we can also learn a lot from the design of earlier versions of Mac OS, especially the importance of consistent interfaces.

Again, Apple seems to be on the right track with OS X. One of the things that made me most excited about iPhoto, for instance, was that Apple has built a framework for users and developers to add extensions.

In Closing

Mac OS X is a great platform. It's building the future into the system, in terms of the technology choices it's making. It's building on an open, extensible framework in the form of Darwin and FreeBSD. It's learning lessons from the open source community.

Now, as developers, you have to do the same thing. Think network. Think open. Think extensible. Play well with others.

Tim O'Reilly is the founder and CEO of O'Reilly Media, Inc. Considered by many to be the best computer book publisher in the world, O'Reilly Media also hosts conferences on technology topics, including the O'Reilly Open Source Convention, Strata: The Business of Data, the Velocity Conference on Web Performance and Operations, and many others. Tim's blog, the O'Reilly Radar, "watches the alpha geeks" to determine emerging technology trends, and serves as a platform for advocacy about issues of importance to the technical community. Tim is also a partner at O'Reilly AlphaTech Ventures, O'Reilly's early-stage venture firm, and is on the board of Safari Books Online, PeerJ, Code for America, and Maker Media, which was recently spun out from O'Reilly Media. Maker Media's Maker Faire has been compared to the West Coast Computer Faire, which launched the personal computer revolution.



Copyright © 2009 O'Reilly Media, Inc.