

Delivering this much power in AI data centers is a challenge, but common components are addressing the need

2 min read

Compiled by KHAO Editorial — aggregated from 1 outlet. See llms.txt for citation guidance.


Switch Pyramid.

However, all of this power turns into heat, and removing that heat is no longer possible with traditional air cooling.


Summary

The growth of graphics processing unit (GPU)-based accelerated computing that powers AI workloads is changing data center architecture and driving up a common industry metric: power consumption per rack. Delivering this much power in AI data centers is a challenge, but common components are addressing the need. Liquid cooling comes in various forms, but direct liquid cooling (DLC), also known as direct-to-chip, has become the preferred technology for cooling these chips. However, deploying direct liquid cooling at scale in AI data centers is new and introduces more complexity into an already complex environment.
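To make the power-per-rack metric concrete, here is a minimal sketch of the arithmetic behind it. All of the figures (GPU count, per-GPU wattage, overhead factor) are illustrative assumptions, not values from the article or any vendor specification.

```python
def rack_power_kw(gpus_per_rack: int,
                  watts_per_gpu: float,
                  overhead_factor: float = 1.5) -> float:
    """Estimate total rack power draw in kilowatts.

    overhead_factor is an assumed multiplier approximating CPUs, memory,
    networking, and power-conversion losses on top of the GPU draw.
    """
    return gpus_per_rack * watts_per_gpu * overhead_factor / 1000.0

# Example with assumed round numbers: 32 GPUs at 700 W each
print(rack_power_kw(32, 700))  # 33.6 (kW)
```

Even under these conservative assumptions, the result lands well above the single-digit-kilowatt racks that traditional air cooling was designed for, which is why the heat-removal problem described above follows directly from the power-delivery one.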

Read full article at Datacenter Dynamics →