Chaos: Making a New Science

If there were any doubt that science is driven by personalities and politics as much as by anything else, look no further than James Gleick’s 1987 book, Chaos: Making a New Science.

The book chronicles the history of chaos theory; but “chaos” is also a good word to describe the scientific community’s embarrassingly slow acceptance of the findings and tools of this new mathematical subfield.

The book held particular interest for me because of an unsolved mystery in a research paper I wrote in college, Weather forecasting by computer. Edward Lorenz, who published the first research on what would become chaos theory, calculated in the late 1960s that “even with perfect models and perfect observations, the chaotic nature of the atmosphere would impose a finite limit of about two weeks to the predictability of the weather.” Despite this, I was reading brash predictions in books published in the early 1980s that we would soon be able to forecast the weather months or years into the future. Why did it take more than a decade for this fundamental mathematical result to make its way even to experts writing about weather forecasting?

Gleick wondered the same thing. The fact that his conclusions took the form of an entire book is testament to the many factors at play. (I was a bit relieved to confirm that I wasn’t just missing something obvious.)

Part of the answer is that chaos theory was outside the scope of existing academic disciplines, almost by definition. It tried to make sense of problems that couldn’t be solved using traditional mathematics — the very problems that most researchers (and entire science departments) stayed away from because the chances of progress seemed slim. Over time, disciplinary boundaries developed such that most of these problems were not considered valid topics in physics, biology, or even weather forecasting.

A second part of the answer is that many of the important results of chaos theory were themselves statements of limits on what it is possible to know or achieve, especially when seen through the lens of traditional approaches. Scientists and other leaders didn’t want to believe these pessimistic claims, and the claims were easy to ignore when they came from a fringe group of mathematicians who seemed indifferent to their own careers.

A third part of the answer is that even when the essential properties of chaos theory had been well established by mathematicians, the theory was not useful to mainstream scientists until practical mathematical tools were developed. Several important mathematical results eventually helped to show how disparate data sets all displayed chaotic “bifurcations” and “period doublings,” for example. As scientists were given more concrete patterns to look for, evidence of chaotic behavior became increasingly visible to them.
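
To make “period doubling” concrete: in the logistic map, a one-line model of population growth, the long-run behavior settles into a cycle whose length doubles again and again as the growth rate increases, until it dissolves into chaos. Here is a minimal sketch (my own illustration, not one of the historical results) showing the effect:

```python
# Period doubling in the logistic map x -> r * x * (1 - x).
# As the growth rate r increases, the long-run orbit goes from a
# fixed point to a 2-cycle, a 4-cycle, ... and finally to chaos.

def attractor(r, x=0.5, warmup=1000, samples=8):
    """Iterate past the transient, then sample the orbit."""
    for _ in range(warmup):
        x = r * x * (1 - x)
    orbit = set()
    for _ in range(samples):
        x = r * x * (1 - x)
        orbit.add(round(x, 4))
    return sorted(orbit)

for r in [2.8, 3.2, 3.5, 3.9]:
    print(f"r = {r}: orbit visits {attractor(r)}")

# r = 2.8 -> one value (a stable fixed point)
# r = 3.2 -> two values (period 2)
# r = 3.5 -> four values (period 4: a "period doubling")
# r = 3.9 -> many scattered values (chaos)
```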

A fourth part of the answer is that the main tools used to investigate chaos theory — computers — were new and unfamiliar to mainstream scientists. Lorenz was one of very few theorists in the 1960s who had access to expensive computer time (and the knowledge to use it). And although rigorous mathematical proofs were eventually found for many components of chaos theory, for many years the most important results were simply the outputs of clever computer programs. Running experiments via numerical simulation, rather than in a lab, was a totally new approach. Scientists and mathematicians had every reason to be skeptical.
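
To give a flavor of those early numerical experiments, here is a small modern sketch (not Lorenz’s actual program) in the same spirit: integrate the Lorenz equations from two starting points that differ by one part in a million, and watch the trajectories fly apart.

```python
# Sensitive dependence on initial conditions in the Lorenz system.
# This is a crude forward-Euler sketch for illustration, not
# Lorenz's original program.

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

a = (1.0, 1.0, 1.0)
b = (1.000001, 1.0, 1.0)  # perturbed by one part in a million

for step in range(1, 3001):
    a, b = lorenz_step(a), lorenz_step(b)
    if step % 500 == 0:
        gap = sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
        print(f"t = {step * 0.01:5.1f}   separation = {gap:.6f}")

# The separation grows roughly exponentially until it saturates at
# the size of the attractor itself -- which is why no measurement
# precision can hold a weather forecast together indefinitely.
```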

At the time Gleick’s book was published, chaos had finally become broadly accepted in science and had led to a few high-profile applications such as heart pacemakers. Yet even now, 20 years later, chaos theory is not part of the standard curriculum at any level of school. I studied it for a few weeks in high school as part of a special end-of-year diversion, and in college as an elective math course that was only offered one semester every other year. And I went to very progressive schools. When Stephen Wolfram unveiled his “new kind of science”, non-experts missed the fact that he was talking about this same line of research. The new science is still in its infancy.

Acting

“…acting is an instrument of freedom, which enables people to realize that they are not imprisoned in themselves…”

-Theodore Zeldin

Automatic color temperature

Bill and I had an interesting conversation today about how computer displays (particularly on mobile devices) should automatically adjust not only the brightness of the backlight, but the color temperature of the pixels. For example, in a room with warm lighting, the “white” on the display should look reddish, the same color as the reflection of that ambient light off white paper.
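
We were talking about behavior, not any particular API, but the core idea is easy to sketch: read the ambient color temperature from a sensor and scale the display’s white point toward the matching blackbody color. A rough illustration (the white-point table is approximate, and the ambient reading would come from a hypothetical sensor):

```python
# Sketch of an ambient-adaptive white point (an illustration of the
# idea, not any shipping API). Approximate RGB white points for a
# few blackbody color temperatures, in kelvins.
WHITE_POINTS = {
    2700: (1.00, 0.75, 0.52),   # warm incandescent
    4000: (1.00, 0.85, 0.72),   # neutral indoor
    5500: (1.00, 0.96, 0.92),   # daylight-ish
    6500: (1.00, 1.00, 1.00),   # typical display white
}

def display_gains(ambient_kelvin):
    """Interpolate per-channel gains so 'white' matches the room."""
    temps = sorted(WHITE_POINTS)
    k = min(max(ambient_kelvin, temps[0]), temps[-1])
    lo = max(t for t in temps if t <= k)
    hi = min(t for t in temps if t >= k)
    f = 0.0 if hi == lo else (k - lo) / (hi - lo)
    return tuple(a + f * (b - a)
                 for a, b in zip(WHITE_POINTS[lo], WHITE_POINTS[hi]))

# In a warmly lit room, pull the blue channel way down:
print(display_gains(3000))  # roughly (1.0, 0.77, 0.57)
```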

See Bill’s blog post for the full story.

Sculley on Apple

At the risk of being like all the other bloggers, I feel the need to write down what I didn’t know or found most interesting about the recent interview with John Sculley, who was the CEO at Apple for about ten years.

First, Sculley made it clear that Apple has always been a digital consumer electronics company, and in many ways was the first such company. As digital components continue to become cheaper and more powerful, thus enabling better consumer electronics, it makes sense that Apple will continue to thrive. Another way of saying this is that Apple was so ahead of its time that even after 25 years the market conditions are only beginning to really align with their strategy.

Second, I often explain to people how Apple’s control of both hardware and software is key to their ability to innovate. There is a lot they simply couldn’t do if they didn’t control the whole pipeline. Sculley recounts an excellent example of this, dating back to the first Mac.

Sculley: The original Mac really had no operating system. People keep saying, “Well why didn’t we license the operating system?” The simple answer is that there wasn’t one. It was all done with lots of tricks with hardware and software. Microprocessors in those days were so weak compared to what we had today. In order to do graphics on a screen you had to consume all of the power of the processor. Then you had to glue chips all around it to enable you to offload other functions. Then you had to put what are called “calls to ROM.” There were 400 calls to ROM, which were all the little subroutines that had to be offloaded into the ROM because there was no way you could run these in real time. All these things were neatly held together. It was totally remarkable that you could deliver a machine when you think the first processor on the Mac was less than three MIPs (Million Instructions Per Second). (NOTE. For comparison, today’s entry-level iMac uses an Intel Core i3 chip, rated at over 40,000 MIPS!)

This approach continues to be important in every category of device Apple produces. For example, today’s iPhones and iPads have special-purpose hardware for video decoding (that’s why they can only play movies encoded in certain formats). Microsoft could not really enter the graphical operating system market until standard processors and graphics cards became powerful enough to do most of the graphics routines themselves. If Microsoft produces an innovative operating system that requires hardware that is too specialized or not readily available, the hardware manufacturers will say, “sorry, we can’t support it.” At Apple, Steve Jobs says, “We will find a way.”

There’s a great little quote about a meeting at Microsoft. “All the technical people are sitting there trying to add their ideas of what ought to be in the design. That’s a recipe for disaster.” Sculley thinks that’s part of the Silicon Valley culture started by HP, where engineers are most respected. At Apple, designers are on top. “It is only at Apple where design reports directly to the CEO.”

Sculley repeats over and over how good a designer Steve Jobs is. The following story is about a visit to Edwin Land, the inventor of the Polaroid camera.

Dr Land had been kicked out of Polaroid. He had his own lab on the Charles River in Cambridge. It was a fascinating afternoon because we were sitting in this big conference room with an empty table. Dr Land and Steve were both looking at the center of the table the whole time they were talking. Dr Land was saying: “I could see what the Polaroid camera should be. It was just as real to me as if it was sitting in front of me before I had ever built one.”

And Steve said: “Yeah, that’s exactly the way I saw the Macintosh.”

One thing I did not know (or maybe had forgotten) is that Apple’s Newton product played a central role in the creation of the ARM processor design, which is now used across the industry in most smartphones and other consumer electronics (including the iPhone, iPad, and Apple TV). Moreover:

The Newton actually saved Apple from going bankrupt. Most people don’t realize in order to build Newton, we had to build a new generation microprocessor. We joined together with Olivetti and a man named Herman Hauser, who had started Acorn computer over in the U.K. out of Cambridge University. And Herman designed the ARM processor, and Apple and Olivetti funded it. Apple and Olivetti owned 47 percent of the company and Herman owned the rest. It was designed around Newton, around a world where small miniaturized devices with lots of graphics, intensive subroutines and all of that sort of stuff… when Apple got into desperate financial situation, it sold its interest in ARM for $800 million. […] That’s what gave Apple the cash to stay alive.

So while Newton failed as a product, and probably burnt through $100 million, it more than made it up with the ARM processor.

So the product that has become a “celebrated failure” of business school lore was actually one of the most successful products in the industry, when taking the long-term technological view. Not only did it save Apple from bankruptcy, it enabled an entirely new category of power-efficient mobile computing devices. True, the marketing and business strategy used for the Newton failed. But the underlying technologies and vision not only saved Apple in the 1990s, but also formed the basis of its remarkable growth today.

Everything Apple does fails the first time because it is out on the bleeding edge. Lisa failed before the Mac. The Macintosh laptop failed before the PowerBook. It was not unusual for products to fail. The mistake we made with the Newton was we over-hyped the advertising. We hyped the expectation of what the product could actually [do], so it became a celebrated failure.

Apple has gotten much better at this over time. They are often accused of over-hyping their products, but if you look carefully, Steve Jobs and Apple advertising never claim that their products can do anything they can’t actually do. They say things like “1000 songs in your pocket” and show people dancing to those songs. Some people react with “oh my gosh, that’s what I actually want.” Other people say, “no thanks.”

Technologists complained when Apple described the iPad as “magical.” Obviously the device is not actually magical. But I think that word pretty well describes the actual reaction of many customers to the product.

Airline infographic

An interesting infographic from the New York Times is being passed around today. (Click the excerpt below for the full graphic.)

Airline infographic

I suspect that much of this was hand-crafted using an illustration program. Notice how the merger at the right end of Delta’s row maintains the vertical center from before the merger, whereas in most other places the incoming merger adjusts the vertical center by adding on to the top or bottom of the previous bar.

High Dynamic Range

I’m impressed with the new “high dynamic range” (HDR) photography feature in iOS 4.1 for iPhone 4. The feature basically takes three versions of the image in quick succession, each at a different exposure. Software then combines the three photos using image processing algorithms. The goal is to avoid washed-out bright areas and dark, almost-black shadowed areas.
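
Apple hasn’t published its exact algorithm, but a naive “exposure fusion” sketch conveys the idea: weight each exposure, pixel by pixel, by how well exposed it is (how close it sits to mid-gray), then blend with those weights. This is a toy illustration, not Apple’s implementation:

```python
import numpy as np

def fuse_exposures(images, sigma=0.2):
    """Naive exposure fusion: favor pixels near mid-gray in each
    shot and blend. images are float arrays in [0, 1], (H, W, 3)."""
    stack = np.stack(images)                       # (N, H, W, 3)
    # "Well-exposedness": a Gaussian bump centered on mid-gray.
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * sigma ** 2))
    weights /= weights.sum(axis=0, keepdims=True)  # normalize per pixel
    return (weights * stack).sum(axis=0)

# Three synthetic "exposures" of the same scene:
scene = np.random.rand(4, 6, 3)
under, over = scene * 0.4, np.clip(scene * 1.8, 0.0, 1.0)
hdr = fuse_exposures([under, scene, over])
```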

I took the picture below with HDR turned on. I did not use a tripod, did not set anything manually, and did no post-processing other than cropping. (Click it to see full resolution.)

Seattle skyline using iPhone with HDR

The plain, non-HDR version of the image looked pretty good too, but everything was more washed out, especially the buildings and sky. The trees were a bit brighter but didn’t look as rich. I think the HDR version looks astonishingly professional.

Math in Ancient India

I just checked my web server statistics and found that part of my high school research paper on the history of mathematics is getting well over a thousand requests a month.

The topic of that paper is “the usually unrecognized achievements of Vedic and Hindu mathematicians from 2000-300 B.C.” When writing it, I was surprised at how hard it was to find good research on the topic:

In Mathematical Thought from Ancient to Modern Times, [a “comprehensive” summary of] the history of mathematics, [author Morris Kline] included only half a chapter (out of 50 chapters total) on Indian math. Everything he said seemed to sneer at them, put them down, and belittle their accomplishments.

Despite this dearth of understanding, the facts were clear:

Besides using simple arithmetic operations like addition, subtraction, etc., Indians invented the decimal system and the idea of positional notation, both of which are still in use today. They also used the “Pythagorean” theorem and “Pascal’s” triangle long before either of those men were born!

Today, it turns out that if you google “Sulva Sutras” (the title of the web page getting most of the visits), my research paper is the first result, above Wikipedia and everything else! If you search for “mathematics in vedas”, I’m the third result.

True, this seeming popularity may have something to do with spelling inconsistencies — the Wikipedia article uses “Shulba Sutras” and is the first result if you search with that spelling. It’s also true that my paper was written in 1999 and has been on the web since 2003, so it has had time to gather links from other websites (which influence Google’s ranking).

But we’re talking about the founding documents of mathematics! The origin of zero! The “Pythagorean” theorem, recorded hundreds of years before Pythagoras! And the most relevant article was written by a fifteen-year-old?

In my research paper’s conclusion (which is mostly too embarrassing to quote), I wrote, “I find it unbelievable how little work has been done in the field…. The vast majority of the work has been done only by Indians. Most of the books on the subject are written in Sanskrit or Hindi [and are ignored by] eurocentric scholars.”

Ten years later, my incredulity lives on.

Steve Jobs’ fifth revolution

I think one of the most significant announcements at Apple’s media event today was that the iPod touch now has over 50% market share in the worldwide portable gaming industry — the iPod touch outsells the portable game consoles from Nintendo, Sony, and all other manufacturers, combined. Steve Jobs also said over 1.5 billion games have been downloaded so far to iPod touches alone. “It has become by far the most popular game player in the world.”

Steve Jobs discusses the iPod touch as a gaming platform

It’s widely recognized that Jobs has already revolutionized four industries: personal computers (Mac), digital music (iPod and iTunes), animated films (Pixar), and smartphones (iPhone). I think it’s now safe to add a fifth to that list: portable gaming.

His impact is a revolution both in terms of the new multi-touch user interface for gaming and the App Store platform for game distribution and payment. The major products involved are not just the iPod touch but also the iPhone and iPad.

So what will be number six?

Apple is making some progress on movies and TV shows. However, the studios and cable companies have all the power, and they are terrified about what happened to the music industry. It’s hard to find a path that transitions the industry from cable TV “channels” to browsing and paying for individual shows.

Another possibility for revolution is in textbooks and online education, where iTunes already carries recorded lectures and the iPad has started to inspire a new class of interactive educational content.

7″ iPad

I continue to see rumors that Apple will release a 7-inch iPad. The idea is that it would be closer in size to a paperback or Kindle; lighter and less expensive than the current iPad; and easier to fit in a purse.

I’m a bit scared of this vision because it means all of our apps would have to be redesigned for yet another screen size. iPhone apps already do run on the iPad, but they are awkward to use. Scaled-down iPad apps are not really an option because the touch targets would be too small. Apple could use a screen with the same number of pixels as the iPhone 4 (but bigger in size); that way, all retina-display-compatible apps would fit pixel-by-pixel on the device. Still, graphics would look too big and some interactions would still be awkward. In short, redesigning our apps would be necessary for a good user experience.
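
The pixel arithmetic behind that worry is easy to check. A quick sketch, using nominal screen sizes (3.5″ for the iPhone 4, 9.7″ for the iPad, and an assumed 7″ for the rumored device):

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixel density from resolution and diagonal screen size."""
    return math.hypot(960, 640) / diagonal_inches if False else \
           math.hypot(width_px, height_px) / diagonal_inches

iphone4 = ppi(960, 640, 3.5)    # ~330 ppi
ipad    = ppi(1024, 768, 9.7)   # ~132 ppi
seven   = ppi(960, 640, 7.0)    # ~165 ppi (hypothetical device)

# Apple's recommended touch target is 44 points, i.e. 88 pixels
# on a retina-resolution screen.
for name, density in [("iPhone 4", iphone4), ("7-inch", seven)]:
    print(f"{name}: {density:.0f} ppi, "
          f"44pt target = {88 / density:.2f} inches")
# On the hypothetical 7-inch screen, an unmodified iPhone app would
# render at roughly twice its intended physical size.
```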

This redesign will be a lot easier than porting apps from the Mac to the iPad. Still, for the sake of my own sanity, I hope Apple waits a while before introducing the next screen size.


Update: On the other hand, most of our existing Mac apps have to be designed to work well at any screen resolution between the 13″ MacBook and the 30″ Cinema Display. From this point of view, having to support just two discrete iPad sizes should be comparatively easy.


Update 2: Steve Jobs just criticized the 7-inch form factor.