Commentary By Mark P. Mills

Big Data's Green Creds


Some anniversaries pass with little fanfare. The Alto, the world’s first personal computer, was invented 40 years ago by Xerox at its Silicon Valley PARC research labs. Seeing it inspired Steve Jobs to create the epoch-changing Mac, which in turn led to the proliferation of billions of computers around the globe.

This summer, just five miles from the PARC campus, Google held a remarkable summit that wrestled with two interrelated issues no one anticipated 40 years ago: the energy implications of billions of electricity-using computers, and the economic realities of the coal-dominated electric grid. The confab, keynoted by Al Gore, was titled “How Green Is the Internet?”

Circa 1973 that question would have been odd, even if you substituted “computing” for “the Internet.” Circa 1933, in the afterglow of Amelia Earhart’s transatlantic flight, it would have been comparably strange to ask whether aviation was “green.” Who could have foreseen that aircraft would link the world and consume 1.5 billion barrels of oil in 2013 alone, roughly India’s annual consumption?

Also unimaginable in 1973 was that by 2013 a single warehouse-scale computer (Google’s phrase) would use as much power as a Boeing 777, and that thousands of such data centers, along with the entire computing ecosystem, would consume more energy annually than global aviation.

How did that happen? Transistors have been shrinking and becoming radically more efficient for decades. But there has been even faster, nearly unimaginable growth in the number of transistors and the associated wired and wireless data networks. So while personal computers now fit in pockets, the Internet’s central computers, the data centers, are the size of shopping malls.

And even though hourly data traffic now surpasses the total annual traffic of 2000, the data world will be much bigger yet. Cisco forecasts that data demand is rising at over 30 percent a year. We are only at the beginning of a great build-out of the electricity-consuming global information-communications-technology (ICT) ecosystem. Microsoft talks about data center “build outs at a scale no one has ever seen before.” Tech industry researchers at IDC estimate capital expenditures of $3 trillion on the global ICT infrastructure in the coming decade. That is a lot of hardware.
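To put Cisco’s figure in perspective, growth compounding at 30 percent a year multiplies demand roughly fourteen-fold over a decade. A quick back-of-envelope check (the 30 percent rate is the only input taken from the text; the rest is arithmetic):

```python
# Compound growth: demand after n years at annual rate r is (1 + r) ** n.
rate = 0.30    # Cisco's ~30 percent annual growth figure
years = 10

multiple = (1 + rate) ** years
print(f"Demand multiple after {years} years: {multiple:.1f}x")
# 1.3 ** 10 is about 13.8, i.e. nearly a 14-fold increase in a decade
```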

Since bits are electrons, the ICT ecosystem is a big and growing user of electricity. Counting everything from PCs and tablets, to data centers for Facebook, the NSA, and corporations, to wireless and wired networks and the tech factories that build them, global ICT has risen from nearly nothing to using more electricity than Japan and Germany combined.

Given the well-publicized “war on coal,” here’s the rub. According to the International Energy Agency (IEA), 68 percent of the world’s new electricity over the past decade came from coal. And the IEA forecasts that even if a hoped-for $4.5 trillion in global alternative energy subsidies is actually spent over the coming two decades, coal will still anchor the grid. Hence the Google green summit, an entire panoply of conferences and studies, and corporate green-washing efforts.

Most ICT energy use comes from the billions of PCs, tablets, smartphones, and digital TVs, the millions of cell towers, and the trillions of microprocessors. But it is the 20 percent used in data centers, some as big as a million square feet each, that is the political and regulatory greening target. These are the mainframes of our era, just as IBM’s vaunted 360 Series was when Apple IPO’d in 1980. But today, instead of thousands of corporate computer rooms guzzling 10 kilowatts each, we see thousands of warehouse-scale computers, each inhaling power by the tens of megawatts.

A global survey by DataCenter Dynamics found the cost and availability of electricity to be the number-one concern of data center owners. In the U.S., locating in a state where coal keeps rates low can cut the operating costs of a single enterprise-class data center by $350 million over the life of the ICT hardware. The same survey found that global data center power demand has doubled since 2007.
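The $350 million figure is consistent with one plausible combination of inputs. The data center size, rate gap, and hardware life below are illustrative assumptions, not numbers from the survey:

```python
# Back-of-envelope: operating-cost savings from cheaper electricity.
# All three inputs are illustrative assumptions, not survey data.
power_mw = 50            # assumed draw of an enterprise-class data center
rate_gap = 0.08          # assumed $/kWh gap between high- and low-cost states
years = 10               # assumed service life of the ICT hardware
hours_per_year = 8760

kwh = power_mw * 1000 * hours_per_year * years
savings = kwh * rate_gap
print(f"Savings over {years} years: ${savings / 1e6:.0f} million")
# 50,000 kW x 8,760 h x 10 y x $0.08/kWh is roughly $350 million
```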

Facebook just reported an astonishing 33 percent one-year jump in its electricity use; the electric sector is accustomed to single-digit growth rates. Give Facebook credit for revealing, on a Facebook page, that coal was its single largest source of power (all told, conventional fuels, not renewables, provided 81 percent). Google, meanwhile, recently announced a massive data center expansion in Iowa, where it has also invested in wind farms. Iowa, while second only to Texas in wind generation, gets 70 percent of its low-cost, reliable kilowatt-hours from coal.

If your goal is cutting carbon emissions, what do you do about this inconvenient energy reality for ICT? Locating data centers remotely, closer to green power in countries like Iceland and Sweden, is limited by physics. Even at light speed, real-time financial transactions limit distances from users to mere tens of miles. And distances for useful streaming video or real-time operation of everything from factories to hospitals can be stretched only to hundreds, not thousands, of miles. It is a remarkable fact of a world ostensibly “flattened” by the Internet that proximity still matters.

Data centers are consequently located all over the world, especially close to and in dense urban cores, from downtown Beijing to Manhattan. Bloomberg was just promised a special deal on low-cost power to entice the company to build its new $700 million data center on the outskirts of Manhattan. In short, the electricity choices and challenges for data are no different than for any other power user, except that the need for ultra-reliability is in a league previously confined to niches like hospitals.

So data center engineers and operators obsessively pursue efficiency, continually burning fewer kilowatt-hours per gigabit. But radical gains in energy efficiency are precisely what caused the proliferation of computing. At the efficiency of a 1973 IBM mainframe, a single iPad would use 100 MW, more power than a Boeing jet, and a single warehouse-scale computer would need the entire Texas electric grid.
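The scale of that efficiency gain can be sanity-checked with rough numbers. Assuming an iPad draws on the order of 10 watts (an assumption, not a figure from the text), the 100 MW claim implies computing is roughly ten million times more energy-efficient per unit of work than in 1973:

```python
# Rough implied efficiency gain since 1973.
# The 100 MW figure is from the text; the 10 W iPad draw is an assumption.
ipad_watts_today = 10         # assumed typical iPad power draw, in watts
ipad_watts_1973_eff = 100e6   # 100 MW at 1973 mainframe efficiency (per text)

gain = ipad_watts_1973_eff / ipad_watts_today
print(f"Implied efficiency gain: {gain:,.0f}x")
# about a 10,000,000-fold gain over four decades
```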

Apple, Cisco, Microsoft, Verizon, and the entire ICT coterie trumpet their sustainable energy goals. Nearly all of it is reasonable. Who wants to be put in the modern equivalent of the stocks and lambasted with anti-green epithets? But two stubborn facts remain.

First, data demand can only grow. Billions more people, and trillions of things, will yet connect to the wired and wireless broadband networks. The number of warehouse-scale computers is likely to surpass the number of jumbo jets, and far more energy will be used to move digital bits than to transport people and cargo in airplanes.

Second, with data as with aviation, consumers like service that is fast and low-cost. Speed in aviation and computing always has an energy cost, and for both, hydrocarbons are the cheapest fuels. In the end, cost matters both to economies and the systems that propel them.

Still, some tech companies today seem to be trying to rebrand themselves along the same lines that BP tried with its ill-fated “beyond petroleum” trope. Computing won’t get beyond electricity, and the world’s grids won’t get beyond coal any time soon. Happy anniversary, Alto.

This piece originally appeared in RealClearEnergy