Across the globe, data centers are stacked to the ceiling with shelf after shelf of humming servers. The sheer number of operating CPUs takes a toll on data centers’ energy bills, but the real culprit driving up the facilities’ energy costs may actually be their thermostats.
Like humans, computers work best in a small temperature window roughly between 18 °C and 27 °C, with the sweet spot being about 24 °C. Data centers are on track to require an estimated 848 terawatt-hours by 2030, and up to 40 percent of that total will go toward cooling alone.
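The scale of those projections is easier to grasp as a back-of-the-envelope calculation. A minimal sketch, using only the figures quoted above (848 TWh total by 2030, with up to 40 percent going to cooling):

```python
# Rough estimate of projected data-center cooling demand, based on the
# article's figures: 848 TWh total by 2030, up to 40% of it for cooling.
total_twh = 848
cooling_share = 0.40  # upper-bound share quoted in the article

cooling_twh = total_twh * cooling_share
print(f"Projected cooling demand: up to {cooling_twh:.0f} TWh")
```

That upper bound, roughly 339 TWh, is more electricity than many entire countries consume in a year, which is why cooling strategy is worth rethinking at all.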
Small improvements in energy consumption can be eked out by improving server operation efficiency. However, some experts believe that drastically changing how data centers are kept cool, or even warm, may be the solution.
A paper published on 18 October in Cell Reports Physical Science and another presented at the 2022 International Electron Devices Meeting present two very different visions: one in which data centers are kept at a sweat-inducing 41 °C, and another in which they’re cooled down to an inhospitable 100 kelvins (roughly -173 °C).
The Case for Cryogenic Cooling
Arnout Beckers and Alexander Grill are coauthors on the cryogenic cooling paper. Beckers is an engineer and Grill is a researcher, both at Belgian nanoelectronics and digital technologies company Imec. They explain that cryogenically cooling a data center would not mean turning the whole building into an ice cube. Instead, the idea makes use of extremely cold and nonconductive liquids, like liquid nitrogen, to cool server systems by immersing them in the liquid.
“The main difference is in the cooling with liquids instead of air,” Beckers and Grill write in a joint email response. “Liquid-immersion cooling is already a trend coming to data centers, but with liquids above ambient temperature.”
At these extremely cold temperatures, computing systems can see increases in efficiency thanks to reduced electrical resistance and faster transistor switching. Yet Beckers and Grill say that cooler isn’t always better. For example, cooling these classical servers down to the temperatures needed for quantum computers (1 kelvin, or -272 °C) wouldn’t make the computers hyperefficient.
Beckers, Grill, and their coauthors argue that by bringing servers’ temperatures down through cryogenic cooling, data centers could see a 16-fold increase in computational performance, partially offset by a 4-fold increase in the energy used to power the cooling system.
“In a cold data center, most of the energy will go to the cooling, and only a small fraction will be for compute. The aim is to lower the compute energy as much as possible to maximize the net benefit,” Beckers and Grill write.
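The trade-off the authors describe can be sketched as a simple performance-per-energy ratio. This is an illustrative simplification, not the paper's actual model: it assumes cooling dominates the energy budget (as the quote above suggests) and uses only the headline factors of 16x performance and 4x cooling energy.

```python
# Hypothetical net-benefit check for cryogenic cooling, using the headline
# figures from the paper: ~16x computational performance for ~4x the energy
# spent on cooling. "Performance per unit energy" is an assumed metric here.
def net_gain(perf_factor: float, energy_factor: float) -> float:
    """Improvement in performance per unit energy, relative to baseline."""
    return perf_factor / energy_factor

print(net_gain(16, 4))  # net 4x improvement under these assumptions
```

Under these numbers the cold data center still comes out ahead, which is why the authors focus on driving the compute side's energy down further to widen that margin.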
The Case for Warm Data Centers
Rakshith Saligram is a graduate student in electrical and computer engineering at the Georgia Institute of Technology whose research focuses on cryogenic computing. He…
The post “Should Data Centers Be Kept Cool—Or Warm?” by Sarah Wells was published on 11/16/2023 by spectrum.ieee.org