The future is here, and it’s weird: Google is now putting a self-taught algorithm in charge of part of its infrastructure.
It should surprise no one that Google has been working intensively on artificial intelligence (AI). The company developed an AI that beat the world champion at Go, an incredibly complex game, but that’s hardly been its only application. Google also taught one of its AIs to navigate the London Underground and, more practically, trained another to optimize the cooling of its data centers.
The company had the AI learn how to adjust a cooling system to reduce power consumption, and by acting on the AI’s recommendations, it cut the energy used for cooling at one of its data centers by as much as 40 percent.
“From smartphone assistants to image recognition and translation, machine learning already helps us in our everyday lives. But it can also help us to tackle some of the world’s most challenging physical problems — such as energy consumption,” Google said at the time.
“Major breakthroughs, however, are few and far between — which is why we are excited to share that by applying DeepMind’s machine learning to our own Google data centres, we’ve managed to reduce the amount of energy we use for cooling by up to 40 percent.”
The algorithm learns through a technique called reinforcement learning, which works by trial and error: the system tries an action, observes the result, and receives a reward when the result improves. As it learns, it gets better at choosing which actions are worth trying next, which lets it improve much faster. Essentially, it teaches itself.
In this particular case, the AI tried different cooling configurations and found settings that greatly reduced energy consumption, saving Google millions of dollars in the long run and lowering the data center’s carbon emissions.
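To make the trial-and-error idea concrete, here is a minimal sketch of a stateless, bandit-style form of reinforcement learning applied to a toy cooling problem. Everything in it is an assumption for illustration: the simulated data center, the discrete setpoints, and the simple value-learning agent are stand-ins, not DeepMind’s actual system, which works with real sensor data and far more sophisticated models.

```python
import random

# Hypothetical toy model: energy use as a function of a cooling setpoint (degrees C).
# In reality the relationship involves dozens of interacting variables; here it is
# a simple curve with a sweet spot around 24 degrees plus some noise.
def simulated_energy_use(setpoint_c):
    return (setpoint_c - 24) ** 2 + 50 + random.gauss(0, 2)

SETPOINTS = list(range(18, 30))          # candidate actions (cooling setpoints)
q_values = {s: 0.0 for s in SETPOINTS}   # learned estimate of each action's value
counts = {s: 0 for s in SETPOINTS}
EPSILON = 0.1                            # how often to explore a random action

for step in range(5000):
    # Trial: mostly pick the best-known setpoint, sometimes explore another one.
    if random.random() < EPSILON:
        action = random.choice(SETPOINTS)
    else:
        action = max(SETPOINTS, key=lambda s: q_values[s])

    # Error signal: reward is negative energy use (less energy = better).
    reward = -simulated_energy_use(action)

    # Update the running value estimate for that setpoint (incremental average).
    counts[action] += 1
    q_values[action] += (reward - q_values[action]) / counts[action]

best = max(SETPOINTS, key=lambda s: q_values[s])
print(f"Learned best setpoint: {best} degrees C")
```

In this sketch the agent converges on the setpoint with the lowest simulated energy use simply by trying options and remembering which ones paid off, which is the essence of the trial-and-error approach described above.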
Now Google has taken things one step further and handed full control of the cooling system over to the AI. Joe Kava, Google’s vice president of data centers, says engineers already trusted the system and that the transition raised few issues. A data center manager will still oversee the process, but if everything goes according to plan, the AI will run it on its own.
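The handover is not unconditional. One plausible way to picture the arrangement, sketched here under assumed names and limits rather than as a description of Google’s actual controls, is a loop in which the AI proposes an action, hard safety limits vet it, and a conservative fallback kicks in (and alerts the human overseer) whenever a proposal is rejected.

```python
# Hypothetical supervisory loop: the AI proposes a cooling action, hard safety
# limits vet it, and a conservative fallback applies if the proposal is rejected.
# Names, limits, and structure here are illustrative assumptions only.

SAFE_SETPOINT_RANGE = (18.0, 27.0)   # assumed hard limits, in degrees C
FALLBACK_SETPOINT = 22.0             # conservative default an operator might choose

def within_safety_limits(setpoint_c):
    low, high = SAFE_SETPOINT_RANGE
    return low <= setpoint_c <= high

def apply_setpoint(setpoint_c, proposed_by):
    # Stand-in for actuating the real cooling equipment; here we just log it.
    print(f"Applying {setpoint_c:.1f} C (source: {proposed_by})")

def control_step(ai_proposal_c):
    """One iteration of the supervise-then-act loop."""
    if within_safety_limits(ai_proposal_c):
        apply_setpoint(ai_proposal_c, proposed_by="AI")
    else:
        # Out-of-bounds proposal: fall back and flag it for the human overseer.
        print(f"Rejected AI proposal of {ai_proposal_c:.1f} C; alerting operator")
        apply_setpoint(FALLBACK_SETPOINT, proposed_by="fallback")

# Example: a reasonable proposal, then one that trips the safety check.
control_step(23.5)
control_step(31.0)
```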
This is no trivial matter. Not only does it represent an exciting first (an AI managing an important piece of infrastructure), but it may also help reduce the substantial amount of energy that data centers consume. A recent report from researchers at the US Department of Energy’s Lawrence Berkeley National Laboratory concluded that US data centers account for about 1.8% of the country’s overall electricity use.
Efforts to reduce this consumption have been made, but true breakthroughs are few and far between. This is where machine learning could end up making a big difference. Who knows — perhaps the next energy revolution won’t be powered by human ingenuity, but rather by artificial intelligence.