Cooling systems meet the challenge of power-dense data centres
May 28, 2019 by Jillian Morgan
It is no easy task to cool power-hungry data centres. Operators rely on efficient and cost-effective cooling systems to combat climbing power densities, which can reach upwards of 50 kilowatts per rack – the physical structure used to house the servers that collect, process, store and disseminate data.
Data centre energy consumption has escalated considerably over the last decade, largely due to the rise of artificial intelligence, machine learning, 5G and the Internet of Things (IoT), to name a few. In response, HVAC manufacturers have unveiled innovative cooling systems equipped to meet the unique needs of those facilities.
For Chris Sharp, chief technology officer of Digital Realty, densification (the move to increase capacity without the added real estate) is not a matter of “if” but “when.”
“One of the things that’s challenging out there is that, in a lot of the workloads or architectures that are coming into the data centre today, there’s varying levels of power density demand,” he says.
In June 2018, the California-based company opened the doors to TOR1, a 711,000-square-foot data centre in Vaughan, ON – its third in the Greater Toronto Area. The facility houses 23 computer rooms ranging from 8,600 to 13,000 square feet, equipped to accommodate power capacities between one and three megawatts per room.
Traditionally, power densities would sit at around four kilowatts per rack, according to Sharp. Today, that requirement has jumped to the 12- to 30-kilowatt range and higher.
“To be able to do that without stranding capital, particularly with the different types of cooling technologies that are out there, is absolutely paramount to us and to supporting our customer demand,” he says.
KEEPING IT COOL
The heat emanating off high-power racks can match that of a sizeable oven or boiler, according to Mat Hery, former product manager of the ADHX-35B and CDU1200 at ServerCool, a brand under Nortek Global HVAC, a subsidiary of Nortek Inc. (The product lines are now managed by Doug Garday, principal product engineer at ServerCool.)
The brand’s liquid cooling products, which include cooling distribution units and rear door heat exchangers, have been used by IBM, Fujitsu Ltd. and Lenovo Group Ltd., to name a few.
“Because they are so powerful, they give off so much heat that conventional ways of cooling these servers would be very inefficient and very costly,” Hery says. “These guys have to use liquid cooling. It’s the only way you can actually cool down a server that gives away two and a half kilowatts of heat.”
Nortek Air Solutions LLC, another subsidiary of Nortek Inc., announced in September 2018 that its StatePoint liquid cooling technology would be used in Facebook’s first data centre in Asia.
The system, jointly developed with Facebook engineers, uses a liquid-to-air exchanger in which water evaporates through a membrane separation layer to cool the data centre.
“You can [exchange] about four times more heat with water than with air,” Hery says. “In a sense, your facility is just going to be four times smaller and four times more dense and four times more compact, so you save on real estate, you save on [cooling] costs, you save on energy efficiency.”
For Sharp, maintaining cool temperatures in energy-intensive data centres demands a fresh approach.
“As we progress forward with the densification of power, you’re reaching the end of air-based cooling and we’re really starting to pick up into that liquid cooling technology,” he says.
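The limits of air cooling that Sharp describes can be illustrated with a rough sensible-heat calculation, Q = ṁ·cp·ΔT. The sketch below uses standard textbook properties for air and water; the 30-kilowatt rack load and 15°C coolant temperature rise are illustrative assumptions, not figures supplied by Digital Realty or ServerCool.

```python
# Rough sketch: coolant flow needed to remove 30 kW of rack heat
# at a 15 K coolant temperature rise, comparing air to water.
# Q = m_dot * cp * dT (sensible heat only; textbook fluid properties)

Q = 30_000.0   # rack heat load, W (illustrative assumption)
dT = 15.0      # coolant temperature rise, K (illustrative assumption)

# Air at roughly room temperature: cp ~1005 J/(kg*K), density ~1.2 kg/m^3
air_kg_s = Q / (1005.0 * dT)   # ~1.99 kg/s
air_m3_s = air_kg_s / 1.2      # ~1.66 m^3/s (~3,500 CFM) through one rack

# Water: cp ~4186 J/(kg*K), density ~1000 kg/m^3
water_kg_s = Q / (4186.0 * dT)   # ~0.48 kg/s
water_L_s = water_kg_s           # 1 kg of water is ~1 L, so ~0.48 L/s

print(f"Air:   {air_m3_s:.2f} m^3/s")
print(f"Water: {water_L_s:.2f} L/s")
print(f"Volumetric flow ratio: {air_m3_s / (water_L_s / 1000.0):,.0f}x")
```

Per kilowatt removed, water needs roughly three orders of magnitude less volume flow than air, which is why rear door heat exchangers and in-row liquid cooling become attractive once racks climb into the tens of kilowatts.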
Digital Realty’s facilities support in-row liquid cooling and rear door heat exchangers, which Sharp says allows the company to not only facilitate existing workloads but also “easily flex into the more power-dense requirements.”
“This is something that’s core to every data centre operator… finding that right balance,” he adds. “The mechanical and cooling infrastructure represents the majority of the expenditure… So balancing customer demand without stranding capital, particularly in supporting different power density levels, is absolutely paramount to our success.”
TAKING UP SPACE
While liquid systems have gained considerable traction, a number of air-based data centre cooling technologies have emerged in recent years.
FabricAir Inc., based in Lawrenceville, GA, offers a fabric ducting system designed to distribute air more evenly at a higher velocity compared to its metal counterparts. The company’s technology has been used in server rooms across Canada.
When it comes to air-cooling, Chuck Justice, the company’s vice-president of sales and marketing, says maintaining proper temperature and distribution is essential to the longevity of the servers.
“With a fabric duct system, you’re able to deliver the temperature of the air at the right velocities throughout the entire space,” he says. “Fabric ducting has very little static pressure drop within the duct. In other words, with metal duct work you have obstructions in the system… which create added energy that’s used to overcome those drops in pressure differential.”
The reduced static pressure in the fabric ducting means the air is delivered with less energy, according to Justice. Though, he says the system also offers other benefits, namely installation flexibility.
“You’re not just sharing that space with ductwork,” he says. “One of the biggest challenges is being able to get a system that’s able to navigate through that network of cables that’s up there and everything else that’s shared in the ceiling.”
Looking ahead, Justice says the future of air-cooling in data centres will shift to underfloor air distribution systems.
“If you can think about an office building, above that grid on the ceiling there’s the open space – that’s where all the ductwork runs and the cabling and so forth. With [an] underfloor system, you basically take that and [put it] underneath where the servers lie,” he says.
PULLING IT TOGETHER
Data centres are not just looking to save on energy.
Building a data centre from the ground up, or retrofitting an existing building, takes time. For Sharp, taking a modular approach – in which the cooling systems are built off-site and delivered on skids – allowed Digital Realty to meet customer demand quicker.
“To meet the timelines of building that and pulling that together, the skids were a critical component for us,” Sharp says. “I think that translated not only to the electrical but also the cooling systems. As a contractor, you should really think about how you can help data centre builders think about modularity and the containerized elements to help them meet their timelines.”
As data centres become more ubiquitous and powerful, mechanical contractors and HVAC manufacturers will play an essential role in keeping these facilities cool.
“Data centre energy footprints are increasing every year,” Hery says. “It’s quite an energy hungry industry and our mission is to help that energy and that inefficiency be reduced.”