The breathtaking speed at which our computers evolve is neatly summarized by Moore’s Law: the observation that the number of transistors in an integrated circuit doubles roughly every two years. But this kind of exponential growth in computing power also means that our chips need more and more energy to function, and by 2040, scientists predict, they will gobble up more electricity than the world can produce.
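To get a sense of how quickly that doubling compounds, here is a minimal back-of-the-envelope sketch in Python. The starting transistor count and the exact doubling period are illustrative assumptions, not figures from the SIA report:

```python
# Back-of-the-envelope illustration of Moore's Law-style doubling.
# Assumptions (hypothetical, for illustration only): a chip with
# 2 billion transistors in 2016, doubling every two years.

start_year = 2016
start_transistors = 2e9   # illustrative starting count, not SIA data
doubling_period = 2       # years per doubling, per Moore's Law

for year in range(start_year, 2041, 4):
    doublings = (year - start_year) / doubling_period
    transistors = start_transistors * 2 ** doublings
    print(f"{year}: ~{transistors:.1e} transistors")
```

Even from that modest starting point, the count grows more than 4,000-fold by 2040, which is why the energy bill climbs so steeply unless efficiency improves just as fast.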
The projection originally appeared in a report released last year by the Semiconductor Industry Association (SIA), but it has only recently made headlines, as the group issued its final assessment of the semiconductor industry. The basic idea is that as computer chips become more powerful and incorporate more transistors, they’ll require more power to function unless their efficiency can be improved.
Energy we may not have. The SIA predicts that unless we significantly change the design of our computers, by 2040 we won’t be able to power all of them. And there is a limit to how far current methods can take us:
“Industry’s ability to follow Moore’s Law has led to smaller transistors but greater power density and associated thermal management issues,” the 2015 report explains.
“More transistors per chip mean more interconnects – leading-edge microprocessors can have several kilometres of total interconnect length. But as interconnects shrink they become more inefficient.”
So in the long run, the SIA estimates that under current conditions “computing will not be sustainable by 2040, when the energy required for computing will exceed the estimated world’s energy production.”
A graph in the report illustrates the problem: one curve tracks the power requirements of today’s systems (the benchmark line), and the other tracks total world energy production. The point where they meet, predicted to fall somewhere between 2030 and 2040, is where the trouble starts. Today, chip engineers stack ever-smaller transistors in three dimensions to improve performance and keep pace with Moore’s Law, but the SIA says that approach won’t work forever, given how much energy will be lost in future, progressively denser chips.
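A toy extrapolation makes the crossover easy to see. Every number below is a hypothetical placeholder rather than data from the SIA graph; the point is only that exponentially growing demand eventually overtakes slowly growing supply:

```python
# Toy model of the crossover in the SIA graph: computing's energy demand
# grows exponentially while world energy production grows slowly.
# All values are hypothetical placeholders, not figures from the report.

demand = 1.0          # computing's energy demand, arbitrary units
supply = 100.0        # world energy production, same units
demand_growth = 1.25  # assumed ~25% annual growth in demand
supply_growth = 1.02  # assumed ~2% annual growth in supply

year = 2016
while demand < supply:
    demand *= demand_growth
    supply *= supply_growth
    year += 1

print(f"Under these assumptions, demand overtakes supply around {year}.")
```

With these made-up growth rates the curves cross in the late 2030s; different but still plausible assumptions shift the crossover earlier or later, which is why the prediction spans a range of years rather than a single date.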
“Conventional approaches are running into physical limits. Reducing the ‘energy cost’ of managing data on-chip requires coordinated research in new materials, devices, and architectures,” the SIA states.
“This new technology and architecture needs to be several orders of magnitude more energy efficient than best current estimates for mainstream digital semiconductor technology if energy consumption is to be prevented from following an explosive growth curve.”
The roadmap report also warns that beyond 2020, it will become economically unviable to keep improving performance through simple transistor scaling. Future improvements in computing power will have to come from areas unrelated to transistor count.
“That wall really started to crumble in 2005, and since that time we’ve been getting more transistors but they’re really not all that much better,” computer engineer Thomas Conte of Georgia Tech told IEEE Spectrum.
“This isn’t saying this is the end of Moore’s Law. It’s stepping back and saying what really matters here – and what really matters here is computing.”