Information Technology Has Created a New Electric Demand Paradigm
Since the days of Thomas Edison and his New York City Pearl Street Station, the first central plant to produce commercial electricity, economists and policymakers have considered growth in electricity demand a core economic bellwether. That’s because the demand for things that used electricity – lights, appliances, motors – grew in near lock-step with the main features of the economy: housing, offices, manufacturing.
Now comes the Energy Information Administration’s latest forecast, featured last week in the Wall Street Journal, predicting 0.7% per year growth in America’s use of kilowatt-hours (kWh) for the next two decades. Does that portend a future of low economic growth? Not long ago the answer would have been yes. But that metric won’t tell us much today.
The old kWh-GDP link is anchored in a pre-Google, even a pre-AOL world. Over the next two decades we will see a continuation of a transformation that’s been underway since the early 1990s – information-communications technologies (ICT) radically altering the electric sector. ICT makes everything more efficient, so a lot of economic growth can occur with much less electricity used in all the conventional places that burn kilowatt-hours. At the same time, however, huge amounts of electricity are now needed to power the ICT sector itself. How do these two countervailing trends balance out?
(We note, as an aside, that overall electric demand actually shrank in each of the three most devastating years of the Great Recession – but that’s hardly informative given the depth of the losses across the economy.)
Start with how much more efficient we’ve already become. Look at what’s happened in the applications that account for 80% of our electricity use – illumination, cooling/heating, and moving stuff (very little electricity, about 0.01%, is used to move humans). Compared to two decades ago, air conditioners and refrigerators use 70 to 80% less electricity. New commercial lighting typically uses 80 to 90% less. You can’t find an (inefficient) Edison incandescent light bulb in most hotels and offices, and highly efficient compact fluorescents have invaded many homes as well. The new classes of industrial motors use 50% less electricity than those of two decades ago. These are huge gains in efficiency.
Now look at the growth in population, buildings and the economy. Over the past couple of decades the U.S. population has grown about 25%. Our GDP has grown 60%, and industrial output 50% (yes, it grew, just not as fast as in China). The amount of commercial building space rose about 25%, and total residential space (much less energy-intensive) grew about 50%. You don’t have to be a statistician to see that the efficiency gains noted above have far outpaced the growth in the things that drive electricity demand.
Put these two trends together and you’d expect electric use would have collapsed. But in fact, over the past two decades, electric demand actually rose an aggregate 35%.
And over the coming two decades, when we can expect even more ICT-driven efficiency gains, the EIA still forecasts a 16% aggregate rise in electric demand. For the record, even that modest 16% added to America’s enormous electric system means we will need to add generating capability equal to the entire power system of Germany.
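The arithmetic connecting the two EIA figures is simple compound growth. As a quick sanity check (a sketch, not the EIA’s own calculation), the forecast 0.7% annual rate compounds over twenty years to roughly the 15–16% aggregate rise cited here:

```python
# Sanity check: does 0.7%/year compound to roughly a 16% aggregate
# rise over two decades? (Illustrative arithmetic, not EIA methodology.)
annual_growth = 0.007   # EIA forecast: 0.7% per year
years = 20

aggregate = (1 + annual_growth) ** years - 1
print(f"Aggregate growth over {years} years: {aggregate:.1%}")  # ~15%
```

The small gap between the compounded ~15% and the quoted 16% comes from rounding in the annual rate; the point is that a sub-1% growth rate still adds up to a Germany-sized block of new generation.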
There is only one thing you can conclude from these trends: something new has emerged and it consumes a lot of kilowatt-hours.
The new factor of course is an astronomical rise in the production and use of all that ICT hardware. Kilowatt-hours are consumed in enormous quantities by everything from the factories where microprocessors are made to the wireless networks, to gargantuan data centers, to the proliferation of ICT devices in homes and offices.
The growth and magnitude of the Internet infrastructure over the past two decades rivals the emergence of the steel industry of the mid-19th century. And it’s not over. While the data are hard to find and isolate for just the U.S., which remains the dominant home for the infrastructure of the digital economy (for now), the global trends are indicative.
According to a Datacenter Dynamics survey, global data-center power requirements grew over 60% just in 2012 – and have risen some 350% over the past five years. Today there are thousands of data centers that simply didn’t exist a decade ago, never mind two decades ago. And they are still being built at a torrid rate. The enterprise-class data centers – Google, Apple, Facebook etc. – sit in buildings that each dwarf a Wal-Mart, and each consume quantities of electricity that rival a steel mill. Demand for data is growing much, much faster than for steel. (For stunning visuals on what data centers look like, check out Google’s helpful site with photos of their infrastructure.)
I’ve written previously about the power trends for data infrastructure. But for our overall purposes here, we note that data centers now collectively use more energy than global aviation (though aviation’s energy comes from oil, and data centers’ from kilowatt-hours, of course). It bears repeating that data centers, while a high-profile feature of the digital economy, comprise only one component of ICT electric use. One must also account for the communications networks, devices operating in businesses and homes (wherein the TV itself is now being drawn into the Internet’s energy accounting), and the huge portion of the manufacturing sector devoted to producing ICT equipment. Let’s also stipulate what should be eminently obvious: the growth of the data infrastructure of our economy has only just begun.
And so the most important issue now is not just absolute electric demand, but the fact that the character of demand is changing. The trends of the past and coming two decades mean that while the overall ‘pie’ is not growing very fast, the share of the pie taken up by information-centric activities has grown rapidly. Information equipment – the Internet and everything associated with it – has a near-universal "always on" imperative. Demand for "always on" is growing rapidly.
"Always on" is the critical feature of the new economy. Electric utilities in America have entered a new era, if they haven’t already noticed. Demand for resilient and reliable delivery is rising far faster than the absolute demand for power. The future will not be dominated by finding ways to add more renewables to a grid, but by ways to add more resiliency and reliability.
It turns out it’s hard to measure demand for reliability. But we can easily measure its inverse – the trends in unreliability: outage frequency and duration. The average incidence of outages has been rising at about 8 to 10% per year since 1990. Annual outage duration has also been rising, at about 14% per year. Obviously the system is showing its age, and the trends are in the wrong direction.
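To see what those annual rates imply over time, here is a small sketch of how they compound. The rates are the figures quoted above; the 20-year horizon is an assumption chosen for illustration:

```python
# How the annual outage-trend rates quoted above compound over time.
# Rates are from the article; the 20-year horizon is an assumption.
def compound(rate, years):
    """Total multiplicative growth from a constant annual rate."""
    return (1 + rate) ** years

years = 20
freq_low  = compound(0.08, years)   # outage frequency, 8%/yr
freq_high = compound(0.10, years)   # outage frequency, 10%/yr
duration  = compound(0.14, years)   # outage duration, 14%/yr

print(f"Outage frequency: up {freq_low:.1f}x to {freq_high:.1f}x over {years} years")
print(f"Outage duration:  up {duration:.1f}x over {years} years")
```

Even the low end of those rates implies outages several times more frequent, and an order of magnitude longer, over a two-decade span – which is why the trend matters more than any single year’s numbers.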
We can also estimate the costs of outages. Studies show some remarkable numbers. Measured in terms of the costs incurred by users per kilowatt-hour of lost power, (i.e., divide the cost of the outage by the kilowatt-hours that would have been used during the outage), we find outages cost $1 to $20 per kWh in the residential sector, $15 to $200 for large businesses, and $300 to $2,000 per kWh for small businesses.
Outages, in other words, cost ten to ten thousand times more per kWh than the power itself. It is hard to imagine that our increasingly ICT-centric economy will become more tolerant of outages, or less economically damaged by them. All this is bullish for the technologies and businesses that can offer greater power reliability (out less often) or resilience (out for less time).
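The per-kWh outage cost described above is just the outage’s total cost divided by the kilowatt-hours that went undelivered. A sketch with hypothetical inputs (the dollar figure, load, and duration below are illustrative assumptions, not data from the studies):

```python
# Per-kWh cost of an outage: total cost incurred divided by the
# kilowatt-hours that would have been delivered during the outage.
def outage_cost_per_kwh(total_cost_usd, avg_load_kw, duration_hours):
    """Cost of an outage per kWh of undelivered electricity."""
    lost_kwh = avg_load_kw * duration_hours
    return total_cost_usd / lost_kwh

# Hypothetical small business: $4,000 lost during a 2-hour outage
# while it would have drawn an average load of 5 kW.
cost = outage_cost_per_kwh(4000, avg_load_kw=5, duration_hours=2)
print(f"${cost:.0f} per lost kWh")          # $4,000 / 10 kWh = $400/kWh

retail_price = 0.10  # assumed ~$0.10/kWh retail power price
print(f"{cost / retail_price:,.0f}x the price of the power itself")
```

The example lands at $400 per lost kWh – inside the $300 to $2,000 small-business range cited above, and thousands of times the retail price of the undelivered power.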
Engineers know how to make and deliver electricity in quantities energy-equivalent to a super-tanker flowing into NYC every day. Now the harder task is to engineer systems for keeping those wires lit up 24×7, whether in the face of extreme events like Hurricane Sandy, or protecting against terrorism and cyber threats, or just normal course-of-business outages. Count on a lot more pressure on regulators, policymakers and utilities to address this problem. The first order of business will be to figure out metrics for measuring reliability and resilience – not just counting outages, which is easy and annoyingly ex post facto.
The main challenge for the grid is not the long-haul transmission part of the system, which is remarkably reliable. (See what PJM, the biggest long-haul grid manager, has accomplished as one example; it withstood the wrath of Hurricane Sandy.) The big challenges are with the local distribution system, which can be brought down not only by the likes of Sandy and lesser storms, but also by everything from car crashes to squirrels – the latter being the single most common cause of outages.
This piece originally appeared in Forbes