Google is putting DeepMind’s machine learning to work on managing its sprawling data centers’ energy usage, and it’s performing like a boss — the company reports a 15% drop in consumption since the AI took over.
Google is undeniably a huge part of Western civilization. We don’t search for something on the Internet anymore, we google it. The company’s servers pretty much handle all of my mail at this point, along with YouTube, social media platforms and much more. But even so, it’s easy to forget that the Google we know and interact with every day is just the tip of the iceberg; it relies on huge data centers to process, transfer and store information — and all that hardware needs a lot of power.
So much power, in fact, that the company decided to do something about it. On Wednesday, Google said it had proved it could cut the energy use of its data centers by 15% using machine learning from DeepMind, the AI company it bought in 2014. These centers use up significant power to cool and maintain an ideal working environment for the servers — requiring constant adjustments of air temperature, pressure, and humidity.
“It’s one of those perfect examples of a setting where humans have a really good intuition they’ve developed over time but the machine learning algorithm has so much more data that describes real-world conditions [five years in this case],” said Mustafa Suleyman, DeepMind’s co-founder.
“It’s much more than any human has ever been able to experience, and it’s able to learn from all sorts of niche little edge cases seen in the data that a human wouldn’t be able to identify. So it’s able to tune the settings much more subtly and much more accurately.”
Suleyman said that the reduction in power use was achieved through a combination of factors. On one hand, DeepMind can more accurately predict incoming computational load — in other words, it can estimate when people will be accessing more data-heavy content such as YouTube videos. The system also matches that prediction to the required cooling load more quickly than human operators can.
“It’s about tweaking all of the knobs simultaneously,” he said.
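To make that concrete, here is a minimal sketch of the idea (not Google’s actual system; the model choice, variable names and synthetic telemetry below are all illustrative assumptions): learn from historical data how much cooling a forecast compute load will require, so the settings can be nudged ahead of a spike instead of after it.

```python
# Illustrative sketch only: a simple regressor that learns how much cooling a
# forecast server load will require, given ambient conditions. The data here is
# synthetic; a real system would learn from years of data-center telemetry.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
n = 5000

# Hypothetical historical features: forecast server load (fraction of capacity),
# outside air temperature (deg C) and relative humidity (%).
forecast_load = rng.uniform(0.2, 1.0, n)
outside_temp = rng.uniform(5.0, 35.0, n)
humidity = rng.uniform(20.0, 90.0, n)
X = np.column_stack([forecast_load, outside_temp, humidity])

# Hypothetical target: cooling power actually drawn (kW), with noise standing
# in for the messy edge cases a human operator tends to miss.
cooling_kw = 120 * forecast_load + 4 * outside_temp + 0.5 * humidity + rng.normal(0, 5, n)

model = GradientBoostingRegressor().fit(X, cooling_kw)

# At run time: given the load expected in the next window, estimate the cooling
# demand and move chiller/fan setpoints toward it in advance.
next_window = np.array([[0.85, 28.0, 60.0]])
print(f"predicted cooling demand: {model.predict(next_window)[0]:.1f} kW")
```

The “tweaking all of the knobs simultaneously” part would then sit on top of such predictions as a control layer, searching over combinations of setpoints that meet the predicted demand most efficiently.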
Ok, so Google’s electricity bill just went down; good for them, but what does this have to do with us? Well, a lot, actually. Data centers gobble up a lot of energy, and that means a lot of greenhouse gas emissions — combined, data centers have emission levels similar to those of aviation. When Google first disclosed its carbon footprint in 2011, it was roughly equivalent to Laos’s annual emissions, but the company claims it has since upped its game, getting 3.5 times as much computational power for the same amount of energy.
Using machine learning is only the latest step in optimizing the system. The company began toying with the idea two years ago and has since tested it on “more than 1%” of its servers, Suleyman said. It is now being used across a “double-digit percentage” of all of Google’s data centers globally and will be applied to all of them by the end of the year. Google hasn’t released the exact amount of power its data centers use, but it claims that its activity in total makes up 0.01% of global electricity use (and most of that probably goes towards the data centers).
But DeepMind is leaving a considerable mark on Google’s energy efficiency: it cut the energy spent on cooling by 40%, which translated into the 15% reduction in the company’s overall power consumption.
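As a back-of-the-envelope check on how those two figures fit together (an illustration, not a breakdown Google has published): if cutting cooling energy by 40% shaves 15% off the total, a simple proportional model implies cooling accounted for roughly 37.5% of overall consumption.

```python
# Rough proportional estimate, assuming the 15% overall saving came entirely
# from the 40% cooling cut (a simplifying assumption for illustration).
overall_saving = 0.15
cooling_saving = 0.40
implied_cooling_share = overall_saving / cooling_saving
print(f"implied cooling share of total energy: {implied_cooling_share:.1%}")  # 37.5%
```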
“I really think this is just the beginning. There are lots more opportunities to find efficiencies in data centre infrastructure,” Suleyman added.
“One of the most exciting things is the kind of algorithms we develop are inherently general … that means the same machine learning system should be able to perform well in a wide variety of environments [think power generation facilities or energy networks].”
Sophia Flucker, the director of Operational Intelligence, a UK-based consultancy that advises data centers on their energy use, said it was feasible that Google had achieved such a big reduction.
“I’ve worked with some award-winning data centres, which still had plenty of room for improvement,” she said.