The Innovator’s Dilemma

After reading Disrupting Class and several articles about disruptive technology on the asymco blog, I decided I should go to the source and read The Innovator’s Dilemma by Clayton M. Christensen, published in 2000. It’s one of those books that seems fairly obvious in retrospect — now that ten years have passed and its lessons have largely been absorbed into business practice and culture.

The book is based on Christensen’s PhD thesis, which originally looked at technology and business trends in the hard disk drive industry. He found that some technologies (such as improved read-write heads) served to “sustain” existing product lines and cement the dominance of existing companies, while other technologies (such as smaller form factors) ended up “disrupting” existing products to the extent that once-dominant companies sometimes went out of business in just a few years.

The reason these companies failed was not that they were poorly managed, but that the disruptive products were in completely separate markets (with their own accompanying “value networks”). The existing companies were simply not designed to compete in those new markets. For example, 8-inch drives were sold to minicomputer makers, while the smaller 5.25-inch drives were sold to desktop computer makers (with shorter design cycles, higher volumes, and lower profit margins). The existing minicomputer customers had no need for 5.25-inch drives, so the 8-inch manufacturers saw no market and no reason to produce them until it was too late and startup companies already dominated the emerging market for desktop hard drives.

In other words, the businesses of making and selling 8-inch versus 5.25-inch drives were so different that being the dominant expert in hard drive technology was not actually much of an advantage. In fact, it was a disadvantage, because the whole organization was designed to compete in the old business and naturally fought attempts to undercut that business.

But how do you know if a given product idea is going to be disruptive?

One clue: disruptive products are usually simpler, less powerful, and carry smaller profit margins than existing products. So they need to find markets that value attributes like convenience, reliability, and ease of use over sheer power. For example, business accounting software in the nineties was driven by the needs of large enterprise customers, and so it was quite complex and powerful. Intuit disrupted this market with QuickBooks, a simpler, cheaper product based on its Quicken personal finance software. It was so much easier to use that it quickly gained an 80% market share among small business owners, who did not need all those extra features.

What makes technologies “disruptive” rather than just “niche” is that they eventually progress far enough to compete up-market with existing product lines. For example, Intuit kept adding features to QuickBooks so that larger and larger businesses could use it, pushing the old software companies back to serving only the largest enterprise customers. A potentially disruptive technology should have a plausible development path that will eventually let it displace existing products up-market.

The big takeaways are:

1. If you want to start a new company, do it with a product idea that is likely to be disruptive. Otherwise, you have very little chance of making any headway against existing players.

2. Generally the only way to manage disruptive technologies from within an existing company is to create a totally separate organization with the sole purpose of going after that disruptive technology. If you don’t keep it separate enough, resources will inevitably be borrowed to take care of existing business and the new products will languish.

Apple has a better record than most of disrupting its own products before competitors get the chance. Horace Dediu makes a good argument that the iPhone should be seen not as “a better phone” but as a disruptive technology for personal computers: a simpler and more convenient way to accomplish computing tasks such as email and web surfing. The inclusion of phone capability just makes it all the more convenient. I know at least one person who decided to get an iPhone instead of a new laptop, and Apple’s iPad is even more competitive with laptop computers. iPhones and iPads will continue to “move up-market” by adding the ability to conveniently handle ever more computing tasks. As this happens, Macs and other desktop PCs will increasingly be seen as high-end tools for power users.

2001: Space Art

I just watched 2001: A Space Odyssey, mostly with the goal of better understanding nerd cultural references. I hadn’t realized until I looked at the DVD jacket that it was released way back in 1968, shortly before the first real-life moon landing in 1969.

I assume (and a skim of Wikipedia confirms) that 2001 is legendary for its pioneering special effects (such as simulated zero-gravity environments and spaceship fly-bys) and for the philosophical and scientific questions it raises. I’m not going to try to dispute its status as a work of genius. I remember enjoying the book version when I read it many years ago.

But of course, by this point in history, artificial intelligence has been thoroughly discussed, and the astronomical cost of space travel makes the lavish, enormous spacecraft in the movie seem absurd (for example, the Jupiter-bound ship is far bigger than necessary to support a mere six crew members).

And it seemed to me that the parts of the film which actually moved the plot forward could have been condensed down to about 15 minutes. The rest is better interpreted as space art, to be enjoyed at leisure in a gallery while pondering the nature of humanity.

All of this is to say that I found the movie to be extraordinarily boring.

But at least I’m one step closer to understanding what the heck my co-workers are talking about…

Practical people

“What is the point of having discussions with practical people who always say you cannot change the world?”

-Theodore Zeldin

Dramatic photo

I took this photo from the Queen Anne neighborhood in Seattle (walking distance from my office), looking southwest towards Elliott Bay.

Camera: iPhone 4.

Post-processing: Digitally removed power lines via Photoshop.

The enemy of truth

“Truth… depends on evidence. Without evidence, anything goes. The enemy of truth is very often not the lie, but the myth.”

-Harold Kroto

Collective intelligence depends on social skills

Researchers from MIT and elsewhere recently published a study in which groups of two to five people solved various problems, such as “visual puzzles… negotiations, brainstorming, games and complex rule-based design assignments.”

They found that “the average and maximum intelligence of individual group members did not significantly predict the performance of their groups overall.” However:

Groups whose members had higher levels of “social sensitivity” were more collectively intelligent [i.e. those groups had better scores on the problems they solved together]. “Social sensitivity has to do with how well group members perceive each other’s emotions,” says Christopher Chabris, a co-author.

The study was billed as a way for managers to form better teams. But the more important point to me is that social intelligence is critical in business. When students enter the workforce without well-honed social skills, the teams they’re a part of are less effective and make worse decisions.

As another of the study’s co-authors said, “What individuals can do all by themselves is becoming less important; what matters more is what they can do with others and by using technology.” If this is true, effective schools will need to prioritize social intelligence in the curriculum.

BASIC was designed for students

I’m not sure when I put this article about CS education reform into my Instapaper, but I learned something new:

In the early 1960s, the professors John Kemeny and Thomas Kurtz developed BASIC (Beginner’s All-purpose Symbolic Instruction Code) at Dartmouth College because they thought educated people, and future leaders of America, should have some first-hand experience with computing.

I like this precedent of developing tools for students first and business markets later. Focusing on students (and particularly “100-level” classes) is a great motivation to keep things simple and easy to learn. If and when the technology eventually becomes powerful enough to compete with other tools, it can win because real people actually want to use it.
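For a sense of how approachable the language was, here is a small program in the style of early Dartmouth BASIC (an illustrative sketch written for this post, not code from Kemeny and Kurtz themselves):

    10 REM PRINT A SIMPLE COMPOUND INTEREST TABLE
    20 LET P = 100
    30 FOR Y = 1 TO 5
    40 LET P = P * 1.05
    50 PRINT Y, P
    60 NEXT Y
    70 END

A complete program fits on a few numbered lines, with no declarations, setup, or compilation step for a first-year student to trip over.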

Bill Buxton strikes again

A few weeks ago, I saw Bill Buxton give a talk at UW for the Puget Sound SIGCHI meeting.

The main takeaway for me was Buxton’s call to study and learn from the history of design. “Know the good parts of history and what to cherry pick from it.” For example, the original set of five-color iPods took many design cues from an early consumer camera product line. “[Apple design chief] Jonathan Ive knows the history.”

Buxton showed photos of what he called the first educational technology device: the PLATO IV from 1972. It included graphical output and a touch screen, and was apparently deployed in schools all over Illinois. The similarities to the iPad are striking. He also demoed a watch from 1984 with a touch surface capable of recognizing handwritten decimal digits; it sold for just $145 (in today’s dollars). And he took a look at the first real smartphone: IBM’s “Simon” from 1993. It is shockingly similar to the iPhone, complete with a row of app icons on the home screen. The only app “missing” is a web browser (the HTML web was still a research novelty in 1993).

There were many other examples that I didn’t note specifically, many of them MIT Media Lab prototypes presented at SIGGRAPH. Buxton also pointed the audience to archive.org for more, such as a video on input devices from 1988.

The second takeaway was Buxton’s theory of the “long nose”: it takes about 20 years to go from an idea to a $1 billion industry. In other words, “Any technology that is going to have significant impact over the next 10 years is already at least 10 years old.” So the important act of innovation is not the “lightbulb in the head” but rather picking out the correct older ideas that haven’t yet hit the elbow of the exponential curve. When change is slow, humans don’t tend to notice; but you can counteract that by explicitly measuring the change as technology progresses. What are the technologies invented 20 years ago that are about to become huge?

Still Magical

As part of testing our upcoming iOS 4.2 release of OmniGraphSketcher for iPad, I just threw together this graph — a more or less exact replica of a textbook economics diagram. All on the iPad, without any fuss.

Economics diagram on OmniGraphSketcher for iPad

I could email the PDF directly to a textbook publisher.

Despite the fact that I’ve been working on this app ever since the iPad was announced, the whole thing still kind of boggles my mind. Even though I know in detail how the app works, Apple’s term “magical” is the best way I know of to describe the experience of using it.