Watching the "Alpha Geeks": OS X and the Next Big Thing

by Tim O'Reilly
05/16/2002

Lunchtime Keynote at the Apple Worldwide Developer Conference, May 8, 2002

Author's Note: The following is not a literal transcript. I speak extempore and typically wander from my script quite a bit. But this is what I wrote up ahead of time as the general drift of what I planned to talk about. Sometimes it's written out fairly completely. In other places, usually where I've addressed the material elsewhere, I've just written brief notes to myself.

Look at Inventing the Future for expansions of some of this material. Parts of this talk were based on that one. And the final section of this talk is a much abbreviated version of the talk on the architecture of Unix and its implications for Web Services that I gave at JavaOne three years ago. I've added some bracketed notes where I know I diverged quite a bit from the script.

This talk was advertised as Tim O'Reilly on OS X. I'm not really going to talk about OS X very directly. I'm going to talk about three things:

1. How you can see the shape of emerging technologies by watching hackers and other "alpha geeks."

This is how we get most of our good ideas at O'Reilly. We look for people who appear to be doing magic, and ask them how they do it. (Remember Arthur C. Clarke's dictum: "Any sufficiently advanced technology is indistinguishable from magic.") There are always people in any field who are the most clued in to the deep trends, who seem to be playing with all the coolest stuff, and seem to have their finger in everything before most people even know about it. We get these people to tell us what they do, and we persuade them to write it down, or tell it to someone else who can write it down.

This is how we figure out what books to publish. And it's also why we called our next conference in Santa Clara the Emerging Technology Conference. This year we're focusing on what I'm calling the emergent Internet operating system, but next year, the big news from the alpha geeks may be something else.

(As to why it's important for developers to think about deep trends and about where things are going, I heard a great quote from Ray Kurzweil at a nanotechnology conference a couple of weeks ago -- "I'm an inventor, and that's what made me interested in trend analysis: Inventions need to make sense in the world where you finish a project, not the world in which you start the project.")

2. Lessons from the future.

What is it that I'm actually seeing by watching these guys? I'm going to talk about the big trends that are coming down, and why I think Mac OS X is riding the wave just right.

3. Lessons from the past.

Mitch Kapor, founder of Lotus Development and co-founder of the Electronic Frontier Foundation, once said, "Architecture is politics." Some system architectures are more "hacker friendly" (and thus innovation-friendly) than others. I'm going to talk about some of the characteristics of these architectures, and the lessons you can take from them for your own development.

Watching the Alpha Geeks

If you look at how new technologies come into play, you typically see this sequence:

1. Someone introduces a fundamental breakthrough: a disruptive technology or business model that will change the nature of the game.

Aside: The term disruptive technology comes from Clayton Christensen's book, The Innovator's Dilemma. He cites two types of innovations: sustaining technologies (cheaper, faster, better versions of existing technologies) and disruptive technologies.

Disruptive technologies are often not "better" when they start out -- in fact, they are often worse. Case in point: the PC. It wasn't better than the mainframe or minicomputer. It was a toy. Similarly, the WWW was far less capable than proprietary CD-ROM hypertext systems, and far less capable than desktop apps. And developers of both derided it as slow, ungainly, and ineffective. This is a typical response to disruptive technologies. Eric Raymond, speaking of open source, quoted Gandhi: "First they ignore you, then they laugh at you, then they fight you, then you win."

Disruptive technologies often lead to a paradigm shift. (I know the phrase "paradigm shift" gets overused. It's a little bit like the knights who say "Ni" in Monty Python and the Holy Grail. I'm going to say "paradigm shift" and it will freeze you in your tracks! Paradigm shift. Paradigm shift.) [In the actual talk, I did an extended aside here on Kuhn's Structure of Scientific Revolutions, and the origin of the concept of the paradigm shift.]

But it's true. The full effect of a disruptive technology paradigm shift often takes decades to be felt. There were two paradigm shifts at work in the PC revolution: first, taking the computer out of the glass house and giving it to ordinary people; and second, basing computers on commodity hardware and industry-standard designs.

There are disruptive business models as well as disruptive technologies. IBM's decision to "open source" their PC design and let other manufacturers copy it was critical to the growth of the market. It's why the Intel-based PC, and not the superior Apple Macintosh, is the dominant hardware platform today.

Often, disruptive technologies "live underground" for a long time before they're ripe for the paradigm shift to occur. For example, the basic concepts of open source have been around for many years, but they didn't become mainstream until wide-area computer networking (which to my mind is a key ingredient of the 'secret sauce' behind open source) became widespread.

OK. So we have a disruptive innovation. What happens next?

2. Hackers and "alpha geeks" push the envelope, start to use the new technology, and get more out of their systems long before ordinary users even know what's possible.

Both the Internet and open source were part of a hacker subculture for many years. I got my first email address back in 1978, when the ArpaNet was exactly the kind of "magic" I was talking about earlier. Some people had it. Others didn't. (And in fact, the origins of sendmail, the mail server that still routes the majority of Internet email, were based on exactly this disparity in skills and access. When he was a researcher at UCB, Eric Allman had ArpaNet access, and everyone wanted an account on his machine. He decided it was easier to route mail from the campus network onto the ArpaNet than to manage 700 accounts.)

Here's a good example that's still a bit far out, but that I'm confident is significant. I held a summit of peer-to-peer networking developers, and when we were sitting around having a beer afterwards, a young FreeNet developer said to Kevin Lenzo (who was there because of his early work on IRC infobots): "You sound familiar."

Kevin mentioned that he was the developer of festvox, an open source speech synthesis package, and that he was the source of one of the voices distributed with the package. "Oh, that's why," the FreeNet developer replied. "I listen to you all the time. I pipe IRC to festival so I can listen to it in the background when I'm coding."

Now I'll guarantee that lots of people will routinely be converting text to speech in a few years, and I know it because the hackers are already doing it. It's been possible for a long time, but now it's ripening toward the mainstream.
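For the curious, here is a minimal sketch of that kind of IRC-to-speech pipeline. It assumes the Festival engine is installed and that its "festival --tts" command reads text from standard input, and it assumes the chat lines arrive on the script's own standard input (say, piped from a log file). It's an illustration of the idea, not the setup the FreeNet developer actually described.

#!/usr/bin/env python
# Rough sketch (illustrative assumptions, not the developer's actual setup):
# read chat text from stdin and hand each line to the Festival synthesizer.
# Assumes Festival is installed and "festival --tts" speaks text from stdin.

import subprocess
import sys

def speak(text):
    # Pass one line of text to Festival, which synthesizes and plays it aloud.
    subprocess.run(["festival", "--tts"], input=text, text=True)

if __name__ == "__main__":
    # Example usage:  tail -f irc-channel.log | python speak_irc.py
    for line in sys.stdin:
        line = line.strip()
        if line:
            speak(line)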

3. Entrepreneurs create products that simplify what the hackers came up with; there's lots of competition around features, business models, and architecture.

A good example: On the Web, CGI was originally a hack. Then we saw a lot of different systems to improve on the CGI model, and make database-driven Web sites easier for everyone: Cold Fusion, ASP, PHP, JSP.
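To make that starting point concrete, here is a minimal sketch of the original CGI model those later systems improved on: the Web server runs a script once per request, passes the request data in environment variables, and sends whatever the script prints back to the browser. The details below (Python as the language, a "name" query parameter) are purely illustrative assumptions.

#!/usr/bin/env python
# Minimal CGI-style script: the Web server sets QUERY_STRING and runs the
# script; everything printed to stdout goes back to the browser.
# The "name" parameter is just an example.

import html
import os
from urllib.parse import parse_qs

params = parse_qs(os.environ.get("QUERY_STRING", ""))
name = params.get("name", ["world"])[0]

# A CGI response is an HTTP header block, a blank line, then the body.
print("Content-Type: text/html")
print()
print("<html><body>")
print("<h1>Hello, %s!</h1>" % html.escape(name))
print("</body></html>")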

4. Things get standardized, either by agreement or by someone winning dominant market share.

Systems get easier for ordinary people to use, but less satisfying for advanced users. During the standardization process, dominant players put up barriers to entry and try to control the market. Entrepreneurs get acquired or squeezed out. Hackers move on to new areas, looking for "elbow room." Innovation slows down. The cycle repeats itself.

The best platforms know how to find a balance between control and hackability, and the best companies learn how to disrupt themselves before someone else does it to them.

Microsoft gets a lot of heat for not leaving enough on the table for others. My mother, who's English, and quite a character, once said of Bill Gates, "He sounds like someone who would come to your house for dinner and say, 'Thank you. I think I'll have all the mashed potatoes.'"

This isn't quite fair, but it gets the point across, at least about some of Microsoft's behavior. I do think that Microsoft is starting to learn something of the lesson that IBM learned many years ago: how to live with dominant market share without killing off all the outside innovation. I do see signs that they are trying to play better with others -- for example, in their work around SOAP.
