Saturday, December 15, 2007

Freeconomics
Chris Anderson
The World in 2008


Online, there really is such a thing as a free lunch

Illustration by Steve Carroll

In 1954, at the dawn of nuclear power, Lewis Strauss, the head of the Atomic Energy Commission, promised that we were entering an age when electricity would be “too cheap to meter”. That did not happen, mostly because the risks of nuclear energy hugely increased its costs. But what if electricity had in fact become virtually free?

The answer is that everything that electricity touched—which is to say nearly everything—would have been transformed. We would be using electricity for as many things as we could; we would waste it, because it would be so cheap that efficiency wouldn't be worth worrying about.

All buildings would be electrically heated. We would all be driving electric cars. Desalination plants would turn sea water into all the fresh water anyone could want, irrigating vast inland swathes and turning deserts into fertile acres, many of them making biofuels. Compared with free electrons, fossil fuels would be seen as ludicrously expensive and dirty, and so carbon emissions would plummet. The phrase “global warming” would never enter the language.

Unlikely? Just such a transformation is already under way, but not in electricity. What is getting too cheap to meter is processing power, storage, bandwidth and all the other enabling technologies of the digital revolution. Thanks to the exponential doublings of Moore’s Law and its equivalents for hard drives and communications, the cost of a given unit of computation, storage or transmission is inexorably dropping towards zero.
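A rough sketch of the arithmetic behind that claim. The 18-month doubling period echoes Moore's Law; the starting price per "unit" of computation, storage or bandwidth is an arbitrary assumption, chosen only to show how quickly a fixed doubling period pushes unit cost towards zero:

    # Back-of-envelope: how a fixed doubling period drives unit cost towards zero.
    # The 18-month period echoes Moore's Law; the $1.00 starting price per unit
    # is purely illustrative, not a figure from the article.

    DOUBLING_PERIOD_YEARS = 1.5   # performance per dollar doubles every 18 months
    START_PRICE = 1.00            # assumed price of one unit today (arbitrary)

    def unit_cost(years_from_now: float) -> float:
        """Cost of one unit after the stated number of years, if the trend holds."""
        doublings = years_from_now / DOUBLING_PERIOD_YEARS
        return START_PRICE / (2 ** doublings)

    for years in (0, 5, 10, 15, 20):
        print(f"after {years:2d} years: ${unit_cost(years):.6f} per unit")
    # After 20 years that is roughly $1 / 2**13.3, about a hundredth of a cent --
    # effectively "too cheap to meter", which is the point of the paragraph above.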

One of the first to notice this and consider its implications was a Caltech professor named Carver Mead. In the late 1970s he was reflecting on the amazing learning curve that the combination of quantum mechanics and information science had started on the surface of silicon wafers. Like Moore before him, he could see that the 18-month doublings in performance would continue to stretch out as far as anyone could see. But he went one step further to consider what that implied about computers. He realised that we should start “wasting” them.



Waste is a dirty word, and never more so than in the 1970s and 1980s. An entire generation of computer professionals had risen to power by doing just the opposite. In the glass-walled computer facilities of the mainframe era, they exercised that power by deciding whose programs should be allowed to run on the expensive computing machines.

Among Mead’s disciples was Alan Kay, working at Xerox’s Palo Alto Research Centre. Rather than conserve transistors for core processing functions, he developed a computer that would frivolously throw them at such silly things as drawing icons, windows, pointers and even animations on the screen. The point of this profligate eye candy? It was ease of use for regular folks, a previously neglected market. Kay’s work became the inspiration for the Apple Macintosh, which changed the world by opening computing to the rest of us.

Today the same is happening in everything from bandwidth to storage. The learning curves of technology cut prices at a rate never before seen. The cost of storing or transmitting a kilobyte of data really is now too cheap to meter. Soon the same will be true for a megabyte, and not long after that a terabyte. And the internet touches nearly as much of our economy as electricity did when Lewis Strauss issued his prediction.

Creative disruption

Bandwidth too cheap to meter brought us YouTube, which is revolutionising (and possibly destroying) the traditional television industry, and Skype, which is hollowing out the phone industry. Storage too cheap to meter brought us Gmail, which in 2004 upended the webmail market by increasing—free—the capacity available to all by a factor of 1,000, to say nothing of the huge free photo capacity of Flickr or MySpace’s invitation to put anything on your personal page for no cost.

Before the iPod, people weren’t asking to carry their entire music collection in their pockets. But Steve Jobs and a few others at Apple understood the economics of storage abundance. They could see that disk drives were gaining capacity for the same price even faster than computer processors were. Demand for massive music collections wasn’t driving this—physics and engineering were. Anyone could extrapolate the curves and see what was around the corner, but only the Apple engineers “listened to the technology”, to use Mead’s phrase, and saw that putting 10,000 songs on a drive smaller than a deck of playing cards was going to be possible by 2001.
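The kind of extrapolation that "listening to the technology" implies is easy to sketch. The per-song size, starting drive capacity and doubling period below are illustrative assumptions for the sake of the example, not historical figures:

    # Sketch of "listening to the technology": extrapolate drive capacity and
    # ask when a pocket-sized drive could hold a whole music collection.
    # All constants are illustrative assumptions, not historical data.

    MB_PER_SONG = 4              # a typical compressed track, assumed
    SONGS_WANTED = 10_000        # the collection size mentioned in the article
    START_CAPACITY_MB = 500      # assumed capacity of a small drive at year 0
    DOUBLING_PERIOD_YEARS = 1.0  # assumed doubling rate for small hard drives

    needed_mb = MB_PER_SONG * SONGS_WANTED   # about 40,000 MB, i.e. roughly 40 GB

    years = 0.0
    capacity = START_CAPACITY_MB
    while capacity < needed_mb:
        capacity *= 2
        years += DOUBLING_PERIOD_YEARS

    print(f"Need ~{needed_mb / 1000:.0f} GB for {SONGS_WANTED} songs")
    print(f"Under these assumptions, a small drive gets there in ~{years:.0f} years")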

The dominant business model on the internet today is making money by giving things away. Much of that is merely the traditional media model of using free content to build audiences and selling access to them to advertisers. But an increasing amount of it falls into the free-sample model: because it is so cheap to offer digital services online, it doesn’t matter if 99% of your customers are using the free version of your services so long as 1% are paying for the “premium version”. After all, 1% of a big number can also be a big number.
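The "1% of a big number" arithmetic is easy to make concrete. Every figure in this toy model (user count, conversion rate, premium price, cost of serving a user) is an invented assumption for illustration:

    # Toy freemium model: nearly all users pay nothing, a sliver pays for premium.
    # Every number here is an illustrative assumption, not data from the article.

    users = 10_000_000       # total users of the free service (assumed)
    paying_share = 0.01      # the 1% who buy the premium version
    premium_price = 50.0     # assumed annual premium subscription, in dollars
    cost_per_user = 0.10     # assumed annual cost of serving one user, free or paid

    revenue = users * paying_share * premium_price   # $5,000,000
    costs = users * cost_per_user                    # $1,000,000
    print(f"revenue ${revenue:,.0f}  costs ${costs:,.0f}  margin ${revenue - costs:,.0f}")
    # 1% of a big number is indeed a big number -- and because the marginal cost
    # of a free user is tiny, giving the service away to the other 99% still pays.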

In 2008, the year of free, Yahoo! will go one better than Google and expand its free webmail to infinity. More music labels will give away music as a promotion for concerts, following Prince's free distribution of his album through Britain's Mail on Sunday in 2007 and Radiohead's offer to let fans choose their price—free, if they want—when they download the latest album. And more newspapers will publish their content free on the internet.

All this marks a pattern. When the cost of serving a single customer is trending to zero, smart companies will charge nothing. Today, the disrupter’s motto is “Be the first to give away what others charge for”. If you listen to the technology, it makes sense.
