Apps are the next medium

In recent months, App Store “apps” have continued to explode in popularity. iOS apps are currently being downloaded at twice the rate of digital music tracks from the iTunes store, with the average iOS user downloading five new apps per month. App developers have received $2 billion in revenue over the App Store’s first 31 months (and that does not include ad revenue).

Horace Dediu recently made the important point that not only are apps here to stay, they are the next important medium. Apple designed iOS such that apps take over the whole device, each one “magically” transforming it into an object with a whole new set of capabilities. The device becomes a map or a compass or a video camera or a textbook or a game or a credit card or a newspaper or a radio or any new combination thereof. This new medium can be used for both art and productivity; charity and industry; form and function. It is not a fad. It is rather the logical next step for human-facing software.

Like music and movies, apps can be bought and sold as if they’re objects. The iTunes store already sells more music than any other physical or online retailer. The App Store is poised to become the de facto provider of consumer device functionality.

As this medium gains traction and more artisans learn how to create apps, I think we will find that we’ve only just scratched the surface of what is possible.

BASIC was designed for students

I’m not sure when I put this article about CS education reform into my Instapaper, but I learned something new:

In the early 1960s, the professors John Kemeny and Thomas Kurtz developed BASIC (Beginner’s All-purpose Symbolic Instruction Code) at Dartmouth College because they thought educated people, and future leaders of America, should have some first-hand experience with computing.

I like this precedent of developing tools for students first, and business markets later. Focusing on students (and particularly “100-level” classes) is a great motivation to keep things simple and easy to learn. If and when the technology eventually becomes powerful enough to compete with other tools, it will win because real people might actually want to use it.

Bill Buxton strikes again

A few weeks ago, I saw Bill Buxton give a talk at UW for the Puget Sound SIGCHI meeting.

The main takeaway for me was Buxton’s call to study and learn from the history of design. “Know the good parts of history and what to cherry pick from it.” For example, the original set of five-color iPods took many design cues from an early consumer camera product line. “[Apple design chief] Jonathan Ive knows the history.”

Buxton showed photos of what he called the first educational technology device: the PLATO IV from 1972. It included graphical output and a touch screen, and was apparently put in schools all over Illinois. The similarities to the iPad are striking. He demoed a watch from 1984 that included a touch surface capable of doing character recognition on decimal digits. It sold for just $145 (in today’s dollars). Buxton also took a look at the first real smartphone: the “Simon” from 1993. It is shockingly similar to the iPhone, complete with a row of app icons on the home screen. The only app “missing” is a web browser (the web was still a research novelty in 1993).

There were many other examples which I didn’t note specifically, many of them MIT Media Lab prototypes published in SIGGRAPH. Buxton also pointed the audience to archive.org for more, such as a video on input devices in 1988.

The second takeaway was Buxton’s theory of the “long nose”: it takes about 20 years to go from an idea to a $1 billion industry. In other words, “Any technology that is going to have significant impact over the next 10 years is already at least 10 years old.” So the important act of innovation is not the “lightbulb in the head” but rather picking out the correct older ideas that haven’t yet hit the elbow of the exponential curve. When change is slow, humans don’t tend to notice; but you can counteract that by explicitly measuring the change as technology progresses. What are the technologies invented 20 years ago that are about to become huge?

Still Magical

As part of testing our upcoming iOS 4.2 release of OmniGraphSketcher for iPad, I just threw together this graph — a more or less exact replica of a textbook economics diagram. All on the iPad, without any fuss.

Economics diagram on OmniGraphSketcher for iPad

I could email the PDF directly to a textbook publisher.

Despite the fact that I’ve been working on this app ever since the iPad was announced, the whole thing still kind of boggles my mind. Even though I know in detail how the app works, Apple’s term “magical” is the best way I know of to describe the experience of using it.

Sculley on Apple

At the risk of being like all the other bloggers, I feel the need to write down what I didn’t know or found most interesting about the recent interview with John Sculley, who was Apple’s CEO for about ten years.

First, Sculley made it clear that Apple has always been a digital consumer electronics company, and in many ways was the first such company. As digital components continue to become cheaper and more powerful, thus enabling better consumer electronics, it makes sense that Apple will continue to thrive. Another way of saying this is that Apple was so far ahead of its time that even after 25 years, market conditions are only beginning to really align with its strategy.

Second, I often explain to people how Apple’s control of both hardware and software is key to their ability to innovate. There is a lot they simply couldn’t do if they didn’t control the whole pipeline. Sculley recounts an excellent example of this, dating back to the first Mac.

Sculley: The original Mac really had no operating system. People keep saying, “Well why didn’t we license the operating system?” The simple answer is that there wasn’t one. It was all done with lots of tricks with hardware and software. Microprocessors in those days were so weak compared to what we had today. In order to do graphics on a screen you had to consume all of the power of the processor. Then you had to glue chips all around it to enable you to offload other functions. Then you had to put what are called “calls to ROM.” There were 400 calls to ROM, which were all the little subroutines that had to be offloaded into the ROM because there was no way you could run these in real time. All these things were neatly held together. It was totally remarkable that you could deliver a machine when you think the first processor on the Mac was less than three MIPS (Million Instructions Per Second). (Note: for comparison, today’s entry-level iMac uses an Intel Core i3 chip, rated at over 40,000 MIPS!)
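Sculley’s MIPS figures put the hardware gap in perspective. As a rough back-of-the-envelope comparison (both ratings are approximate, and MIPS is a crude metric across such different architectures):

```python
# Back-of-the-envelope comparison of the MIPS figures quoted above.
# Both numbers are approximate; MIPS is a crude cross-architecture metric.
mac_1984_mips = 3         # original Macintosh: "less than three MIPS"
imac_2010_mips = 40_000   # entry-level iMac (Core i3): "over 40,000 MIPS"

speedup = imac_2010_mips / mac_1984_mips
print(f"~{speedup:,.0f}x more instructions per second")
# prints: ~13,333x more instructions per second
```

Roughly four orders of magnitude — which is why the graphics tricks Sculley describes were necessary then and unthinkable now.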

This approach continues to be important in every category of device Apple produces. For example, today’s iPhones and iPads have special-purpose hardware for video decoding (that’s why they can only play movies encoded in certain formats). Microsoft could not really enter the graphical operating system market until standard processors and graphics cards became powerful enough to do most of the graphics routines themselves. If Microsoft produces an innovative operating system that requires hardware that is too specialized or not readily available, the hardware manufacturers will say, “sorry, we can’t support it.” At Apple, Steve Jobs says, “We will find a way.”

There’s a great little quote about a meeting at Microsoft. “All the technical people are sitting there trying to add their ideas of what ought to be in the design. That’s a recipe for disaster.” Sculley thinks that’s part of the Silicon Valley culture started by HP, where engineers are most respected. At Apple, designers are on top. “It is only at Apple where design reports directly to the CEO.”

Sculley repeats over and over how good a designer Steve Jobs is. The following story is about a visit to the inventor of the Polaroid camera.

Dr Land had been kicked out of Polaroid. He had his own lab on the Charles River in Cambridge. It was a fascinating afternoon because we were sitting in this big conference room with an empty table. Dr Land and Steve were both looking at the center of the table the whole time they were talking. Dr Land was saying: “I could see what the Polaroid camera should be. It was just as real to me as if it was sitting in front of me before I had ever built one.”
And Steve said: “Yeah, that’s exactly the way I saw the Macintosh.”

One thing I did not know (or maybe had forgotten) is that Apple’s Newton product played a central role in the creation of the ARM processor design, which is now used across the industry in most smartphones and other consumer electronics (including the iPhone, iPad, and Apple TV). Moreover:

The Newton actually saved Apple from going bankrupt. Most people don’t realize in order to build Newton, we had to build a new generation microprocessor. We joined together with Olivetti and a man named Herman Hauser, who had started Acorn computer over in the U.K. out of Cambridge university. And Herman designed the ARM processor, and Apple and Olivetti funded it. Apple and Olivetti owned 47 percent of the company and Herman owned the rest. It was designed around Newton, around a world where small miniaturized devices with lots of graphics, intensive subroutines and all of that sort of stuff… when Apple got into [a] desperate financial situation, it sold its interest in ARM for $800 million. […] That’s what gave Apple the cash to stay alive.

So while Newton failed as a product, and probably burnt through $100 million, it more than made it up with the ARM processor.

So the product that has become a “celebrated failure” of business school lore was actually one of the most successful products in the industry, when taking the long-term technological view. Not only did it save Apple from bankruptcy, it enabled an entirely new category of power-efficient, mobile computing devices. True, the marketing and business strategy used for the Newton failed. But the underlying technologies and vision not only saved Apple in the ’90s, but also formed the basis of its remarkable growth today.

Everything Apple does fails the first time because it is out on the bleeding edge. Lisa failed before the Mac. The Macintosh laptop failed before the PowerBook. It was not unusual for products to fail. The mistake we made with the Newton was we over-hyped the advertising. We hyped the expectation of what the product could actually [do], so it became a celebrated failure.

Apple has gotten much better at this over time. They are often accused of over-hyping their products, but if you look carefully, Steve Jobs and Apple advertising never claim that their products can do anything they can’t actually do. They say things like “1000 songs in your pocket” and show people dancing to those songs. Some people react with “oh my gosh, that’s what I actually want.” Other people say, “no thanks.”

Technologists complained when Apple described the iPad as “magical.” Obviously the device is not actually magical. But I think that word pretty well describes the actual reaction of many customers to the product.

Airline infographic

An interesting infographic from the New York Times is being passed around today. (Click the excerpt below for the full graphic.)

Airline infographic

I suspect that much of this was hand-crafted using an illustration program. Notice how the merger at the right end of Delta’s row maintains the vertical center from before the merger, whereas in most other places the incoming merger adjusts the vertical center by adding on to the top or bottom of the previous bar.

Instinct

“Instinct… is largely memory in disguise. It works quite well when it is trained, and poorly otherwise.”

-Robert Bringhurst (The Elements of Typographic Style)

Form Follows Fiasco

We can always use a reminder to keep it simple. This one comes from the thoughtful and amusing textbook Form Follows Fiasco by Peter Blake, published in 1977. (A friend recommended the book, which was not in the public library holdings but was available used on Amazon for about $4.) In this passage, he is discussing one of the problems with construction via prefabricated modules.

Many wonderfully inventive designers spent decades, if not lifetimes, trying to perfect the absolutely perfect, universal joint — the magic mechanical device that would join their modular panels together in wedlock (yet leaving open the possibility of some future disengagement, for the sake of greater post-marital flexibility).

But it was all in vain. The universal joints, the seams, the gaskets, the unbelievably ingenious interlocking connectors — many of them leaked, wracked, delaminated, or experienced some sort of material fatigue. Yet jointitis — a disease increasingly prevalent among theorists in prefabrication — continued to spread. One of prefabrication’s most illustrious pioneers designed a joint to connect two or more wooden panels; it was a miracle of ingenuity, and required little more from the on-site joiners than a doctorate in Chinese puzzling. The pioneer, it seemed, had never been told of an earlier and less sophisticated joint used in wood-framing, known as the nail.

My take on the overarching theme of the book is that there’s something to be said for a little messiness. The straight, clean, orderly, centrally planned structures of Modernist architecture and Modernist urban planning sound good in theory. But in practice, they are expensive to keep straight and pure, so before long they become ugly (stained cement walls). They are also bland and boring because they are so simple (high-rise apartments and suburbia). And they are inefficient because they artificially standardize (modular approaches that are ok for many uses but not great at anything) and require connecting artificially separated functions (rush hour in heavily zoned cities, and the “universal joints” discussed in the above passage).

The alternative is to turn to more practical, locally and organically designed, human-centric (not technology-centric), financially sustainable structures. He points to examples of old wood-and-brick buildings that have been completely repurposed but still work great; vibrant urban centers like SoHo, which was designed organically by new residents violating the zoning laws; and structures such as Grand Central Station which hide all of the technology (trains, subways, electricity, plumbing) to make a welcoming, functional, human-centric space.

As always, there is a balance to be found between order and disorder, predictability and randomness. The Modernist movement was in many ways a reaction against the disorder and uncleanliness of previous eras. Form Follows Fiasco and more recent trends swing back from the extremely sculpted order of Modernist plans to reintroduce what they hope is a healthy dose of messiness.