By Nathaniel Bullard

Technology improvements so dramatic it’s depressing... for energy statistics

Posted on August 03, 2017

FIFTEEN YEARS ago, Japan’s Earth Simulator was the most powerful supercomputer on Earth. It had more than 5,000 processors. It consumed 6,400 kilowatts of electricity. It cost nearly $400 million to build.

Two weeks ago, a computer engineer built a “deep learning box,” using off-the-shelf processors and components, that handily exceeds the Earth Simulator’s capabilities. It uses a maximum of one kilowatt of power. It cost $3,122 to build.

For the first time in writing this, I’m stumped for a chart. It is difficult -- perhaps impossible -- to show a 99.98% reduction in energy use and a 99.9992% reduction in cost in any meaningful way. It is enough to say that information technology has decreased in cost and increased in computational and energy efficiency to striking degrees.
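Those percentages fall straight out of the figures above. A quick back-of-the-envelope check, using only the power and cost numbers already quoted (6,400 kilowatts and roughly $400 million for the Earth Simulator, versus one kilowatt and $3,122 for the deep learning box):

```python
# Back-of-the-envelope check of the reductions quoted above,
# using the Earth Simulator and deep-learning-box figures.
earth_simulator_kw, box_kw = 6400, 1
earth_simulator_cost, box_cost = 400_000_000, 3122

energy_reduction = 1 - box_kw / earth_simulator_kw
cost_reduction = 1 - box_cost / earth_simulator_cost

print(f"Energy: {energy_reduction:.2%} reduction")  # ~99.98%
print(f"Cost:   {cost_reduction:.4%} reduction")    # ~99.9992%
```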

I would argue that this dramatic improvement has a flattening, or even depressing, economic influence on energy. Dramatically reduced inputs paired with dramatically increased outputs are a boon for consumers and businesses -- unless those businesses sell the energy that drives those inputs. We’ve already seen this: In 2007, US data centers consumed 67 terawatt-hours of electricity. Today, with millions of times more computing power, they consume … 72 terawatt-hours, with less than 1% growth forecast by 2020. Not the greatest news if you’re a power utility that had imagined that more and more information technology would mean more energy demand.

Information technology’s improvement over time has been largely a function of Moore’s Law (which is less a law than an observation). Now, with Moore’s Law potentially coming to its end, it would seem like the extraordinary improvements that got us from a room-sized, $400-million supercomputer to a $3,000 desktop box in 15 years could be coming to an end, too.

If technology companies are no longer able to jam more transistors into a chip, does that mean that improvements in energy consumption will also come to an end? If chip improvements plateau, and deployment increases, can information technology find a way to provide a boost to energy demand?

I doubt it, for both hardware and software reasons.

Even as Moore’s Law is tapping out for general-purpose chips, hardware is becoming increasingly optimized for specific tasks. That optimization -- for such things as graphics processing or neural network computations for machine learning -- leads to greater energy efficiency, too. Google now has its own application-specific integrated circuit called the Tensor Processing Unit (TPU) for machine learning. The TPU “delivered 15-30x higher performance and 30-80x higher performance-per-watt” than central processing units and graphics processing units.

Then there is the software that runs on that custom hardware, which has direct applications for electricity in particular. Last year, Google unleashed its DeepMind machine learning on its own data centers and “managed to reduce the energy used for cooling those data centers by 40%.”

So, new special-purpose chips are much more energy-efficient than older general-purpose chips … and those efficient chips are now used to run algorithms that make their data centers much more energy-efficient, too.

In a famous 1987 paper, the economist Robert Solow said “you can see the computer age everywhere but in the productivity statistics.” Today, we could say the same about the computer age and energy statistics. -- Bloomberg

This column does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.