Commentary by Mark P. Mills and Peter W. Huber

Got a Computer? More Power to You.

It takes electrons to move bits. The digital economy, which most everyone loves, is completely dependent on the big central power plant, which most everyone hates. This is, or ought to be, an inconvenient fact for many politicians. Especially for those who would take credit for our digital prosperity, while opposing, on save-the-earth grounds, just about every fuel and technology that can deliver affordable power in the quantities that the digital economy requires.

Oracle's campus, in the heart of Silicon Valley, is a 13-megawatt electric load. Sun, spread across six campuses in the Valley, uses about 26 megawatts. These companies now consume as much power as small steel mills. And their requirements are growing by more than 7% per building per year. Power consumption in the Valley has been growing at three times the rate of the rest of California. Last Friday, after a summer that brought the Golden State to the brink of electric-power collapse, the California legislature adopted a bill to speed up the licensing and review of new power plants.

Leading Edge

Far from unique, Oracle and Sun typify the leading edge of the digital economy. Most of the extra power that the country now requires is being used in silicon chips and the optical and radio links that unite them across the Web.

A personal computer and its peripherals typically boost power consumption in your home by about 5% per year—more if you run with the Wired crowd. But most of the digital infrastructure's burgeoning demand for power occurs out of sight. Almost all the power it takes to deliver a conventional TV picture is used in the den where you watch it. The Web's invisible infrastructure, by contrast, consumes at least twice as much power as the hardware on your desktop. A Palm Pilot sips tiny amounts of power—but connect it to the Web and it can add as much new electric load as a refrigerator, in the servers, routers, and digital transmission systems used to feed it.

If the digital world could take its power for granted until recently, it was only because Thomas Edison had a century's head start over Andy Grove. But the Intels, Oracles, Ciscos, and Suns are now overtaking the electric power infrastructure they inherited. A year ago we estimated that some 13% of U.S. power output was being used to manufacture and run computers and the sprawling information technology infrastructure. It's more than that today. The country is "siliconizing" everything—and behind every digital bit stands a bucket of electrons, behind every chip a power supply. The amount of power it takes to create, process, or transmit a single bit is cut in half every eighteen months or so. But the number of bits in play is doubling much faster.
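The compounding in those last two sentences is easy to make concrete. Below is a minimal sketch in Python, using illustrative assumptions rather than measured figures: energy per bit halving every 18 months, while the number of bits in play doubles every 12 months.

```python
# Illustrative compounding sketch. The halving and doubling periods are
# assumptions for illustration, not measured data: energy per bit is cut
# in half every 18 months, while the bits in play double every 12 months.
# Total power is proportional to (bits in play) * (energy per bit).

def relative_power(years, halving_months=18.0, doubling_months=12.0):
    """Total power demand after `years`, normalized to 1.0 at year zero."""
    months = years * 12.0
    energy_per_bit = 0.5 ** (months / halving_months)  # efficiency gain
    bits_in_play = 2.0 ** (months / doubling_months)   # traffic growth
    return bits_in_play * energy_per_bit

for year in (0, 3, 6, 9):
    print(f"year {year}: {relative_power(year):.2f}x the starting power demand")
# Prints 1.00x, 2.00x, 4.00x, 8.00x: under these assumptions demand doubles
# every three years, even though each bit gets steadily cheaper to move.
```

Whatever figures one plugs in, the product keeps rising so long as the bits multiply faster than the energy per bit is cut; efficiency only wins the race if traffic grows more slowly than the chips improve.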

Turbine manufacturers and power plant operators certainly know how to generate electrons in the quantities needed. But from minehead to meter, electric power is rigidly controlled by economic regulators on one side, and green regulators on the other, none of whom foresaw silicon's surging appetite for power.

Until quite recently, many regulators were persuaded by pundits who maintained that we had already built the last big power plant we'd ever need. Light bulbs and motors had created the first great wave of demand for electricity a century ago; in the 1950s, air conditioning created the second. By the end of the 1970s, it was supposed to be all over—efficiency and conservation were going to take over from there.

"Only minor increases in electricity consumption occur" in the future, the Union of Concerned Scientists projected in 1980. Electricity consumption has in fact risen over 60% in the two decades since. There were just 20,000 servers in operation in 1995; there are over 6 million today, linked to some 200 million PC-class units beyond.

In recent years, the efficiency of bulbs and refrigerators has indeed risen a lot faster than our demand for more light and ice. But end-of-growth futurists were wrong—inexcusably wrong—in assuming that bulbs and ice marked the end of electrically powered stuff. They didn't have to anticipate Intel or Netscape. All they had to do was study industrial history, which teaches that something new, and with a plug at its tail end, invariably comes along.

Bad forecasts, though amusing in retrospect, can have serious consequences when made. For a long stretch, and nowhere more so than in California, the no-power-growth pundits held sway with regulators, and through them, utilities. The building of new power plants and transmission facilities ground to a halt. Now California, New York, and other states are scrambling to reverse course. But it will take years to rehabilitate and expand a system this huge. And power demands are now growing at twice the rates planned for just a few years ago.

Don't expect yesterday's no-growth authorities to confess error, least of all in an election year. Most of them now concede that "the grid" must be made "more reliable." They're right: the grid does need a lot of work—but doing it won't address a more fundamental problem. The grid does not produce electrons; it merely transmits them.

For increased power production, most of yesterday's no-growth crowd now touts "micropower"—such distributed generation technologies as solar and wind, or maybe fuel cells. Such technologies, they argue, will let us home-brew our own electrons, cleaner and more reliably, and right in our very own basements.

This is mainly wishful thinking. Most of the micropower technologies in actual use—diesel generators and small turbines—are much less efficient, and therefore far more expensive to operate, than the big, central-station units for which they might, at the margin, substitute. Most end up dirtier, too, as a result. Adding back-up generators is certainly essential to increase reliability, and that is why billions of dollars' worth of such units are now being sold. But the micropower technologies at hand can't come remotely close to accommodating even a small fraction of the 3%-4% annual increases in power demand that now lie ahead, not at a cost that ordinary consumers will accept.

Finally, many of the no-growth school still cling to the hope that the problem somehow contains its own solution, that by consuming more energy in chips and fiber-optic lines, we will consume less on highways or in warehouses. Al Gore apparently subscribes to this view. "Already, microprocessors are reducing energy consumption and managing energy flows within machines, causing some truly dramatic reductions in the amounts of energy required," he argues in his recently re-released book, "Earth in the Balance."

Perhaps. By consuming more electricity (and thus, ironically, more coal, which is still the primary fuel behind our electric grid) we might indeed end up consuming less oil and gas, the primary fuels in our transportation and heating sectors. But so far, none of the bottom-line numbers lend any support to that fond hope. We are telecommuting more, and driving more, too. We're ordering more from Amazon.com, and heating more warehouses as well. The digital economy has made us more efficient, but it has also made us richer, and so far, wealth has raised our overall appetite for energy a lot more than digitally enhanced efficiency has curbed it.

Big Irony

Many of the people most responsible for the current power mess are still in deep denial. These, after all, are the people who have found it all but impossible to reconcile their own antipathy for Big Oil with the ordinary citizen's love for his car. Now they must square their antipathy for Big Power with the digital technologies that they themselves embrace, or even claim to have invented. This may be more squaring than some of them can manage.

If pressed, they'll insist we can have it all: more bits, fewer electrons, and less energy consumption overall. We can't. For the foreseeable future, the information age will be powered mainly by coal, uranium, gas, and even oil, by gargantuan furnaces, steam turbines, and generators—not by political hot air.