Chinese artificial intelligence upstart DeepSeek proved crunching vast amounts of data need not guzzle so much energy. Elsewhere, GenAI is on a more energy — and capital — intensive track. But there are ways for the rest of the world to cut back too.
Data centres, the back-office workhorses for AI, data storage and the like, sucked up only some 1.5 per cent of global electricity consumption last year, according to the International Energy Agency. But it projects that usage could more than double to 945 TWh by 2030 or, under one scenario, exceed Japan’s entire electricity consumption.
Once-popular co-location deals, where data centres tap directly into their energy sources, are causing some watchdogs to fret about the knock-on effects on other users. Amazon is fighting to increase power at its data centre directly linked to a Talen Energy nuclear plant in Pennsylvania. There are also moves to switch from dirty fuels to renewables and other cleaner sources. Amazon and Google are among those investing in small modular nuclear reactors.
Sure, some of the activity in data centres is devoted to, say, game-changing medical research. But much — the enormous amount of energy ChatGPT used to turn users’ selfies into Studio Ghibli-style cartoons, for example — is of less certain value.
Broadly, energy consumption falls into two buckets: computational power, including storage and web hosting, and stopping the infrastructure from overheating. An estimated two-fifths of data centres’ energy goes on cooling systems, which are also water intensive.
Plenty of wizardry is going into reducing the latter. Fixes include using different materials, repositioning fans and switching from (expensive and wasteful) air cooling to more direct methods. One example is “direct to chip” cooling, where liquid circulates through a cold plate attached to the power-dense processing units. Those working on supercomputers and data centres are also getting savvier about recycling the heat generated by their kit, channelling it to nearby breweries and refineries.
On the compute side, optimised programmes, improved coding and better algorithms can all make power use more efficient. Progress means chips are less power-hungry too. The IEA notes that Nvidia’s B200 graphics processing unit is 60 per cent more efficient in terms of flops per watt — basically, how much bang a user gets for their wattage — than the previous-generation H100.
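For the curious, the metric is simple division: useful work delivered per unit of power drawn. The sketch below uses made-up placeholder figures, not Nvidia’s published specifications, purely to show how a 60 per cent efficiency gain is computed.

```python
# Toy illustration of the flops-per-watt metric the IEA cites.
# The chip figures below are hypothetical placeholders, NOT real
# B200/H100 specs; they only demonstrate the arithmetic.

def flops_per_watt(peak_flops: float, power_watts: float) -> float:
    """Floating-point operations delivered per watt drawn."""
    return peak_flops / power_watts

# Hypothetical chips: the newer one does 60% more work on the same power budget.
old_chip = flops_per_watt(peak_flops=1.0e15, power_watts=700)  # "H100-like"
new_chip = flops_per_watt(peak_flops=1.6e15, power_watts=700)  # "B200-like"

improvement = (new_chip - old_chip) / old_chip
print(f"Efficiency gain: {improvement:.0%}")  # prints "Efficiency gain: 60%"
```

The same ratio also shows why efficiency and total consumption can move in opposite directions: a chip that is cheaper to run per operation invites many more operations.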
Of course, efficiency gains tend to spur more usage, which once again ups the power take. But data centres have every incentive to bring down their power bills. Besides, taking the strain off computation has pedigree. During the second world war, women stepped in as human calculators to spare capacity on the early computers being used to develop the first nuclear weapons. Intelligence does not always have to be artificial.
louise.lucas@ft.com