Liquid Cooling Gaining in Popularity Again

The idea of cooling your data center equipment using any type of liquid might sound like an absolute non-starter. Lunacy in fact. But thanks to spiraling energy costs, corporate green initiatives and new high-tech coolants, the concept of liquid cooling in the data center is enjoying a renaissance.

Liquid cooling actually goes back a long way in the computer industry: The first computer designed to be cooled by liquid was almost certainly the Cray-2 supercomputer back in 1985, which used an extremely environmentally unfriendly coolant called Fluorinert to dissipate heat. More recently, water-based cooling systems have been used by hardcore gamers, enabling them to overclock their gaming rigs without causing them to melt.

Now a number of new companies have sprung up with solutions that use new coolants. These liquids do not conduct electricity so they can be in direct contact with electronics without causing any damage, and since they are many times better than air at capturing and transporting heat they offer the prospect of dramatically more energy-efficient cooling than is possible using the conventional chilled air approach.

The heat captured by the coolant is generally transferred to a water pipe loop using a heat exchanger, and the heated water is then pumped out of the data center, where the heat can be dissipated into the air using a radiator. But it doesn’t have to be wasted. The waste heat can also be used to warm office spaces or to provide hot water in a building, reducing corporate energy costs further.

The potential benefits of liquid cooling are significant. For starters, a suitably designed system can capture almost all of the heat generated by a server’s components, so there is no need to power internal fans to assist airflow. That in itself can reduce server power consumption by about 30 percent. But the main savings come from reduced air cooling costs. Since heat from the servers is captured by the coolant and removed without warming the air around the server racks, there is little or no need for computer room air conditioning (CRAC) equipment. And since the electricity needed to power CRAC equipment, chilling plants and other cooling equipment may account for as much as 30 percent of data center running costs, the potential savings are enormous.

But there are other potential benefits of liquid cooling, as well. One major benefit is reduced space. Server racks can be packed much more densely without the need for hot and cold aisles, because good airflow is no longer necessary. Some vendors also claim there is the potential to overclock servers, in the way that gamers do, without introducing reliability issues because liquid cooling is so effective. If you are unwilling to risk overclocking then, at the very least, it is reasonable to expect that components that are cooled more effectively should also last longer and prove more reliable. And since liquid cooling is almost silent, unlike CRAC equipment, data center noise levels can be significantly reduced. This makes it easier to comply with Occupational Safety and Health Administration noise regulations, and almost certainly obviates the need for staff to use ear defenders when working close to the server hardware.

Added benefits (and drawbacks)

The design of liquid cooling systems varies, but in systems where the coolant is in direct contact with components, the coolant itself doubles as a fire suppression system. That’s because the coolant is inert and components are no longer exposed to air. This also removes any potential corrosion problems due to air quality when data centers are located close to salty sea air or where there is excess humidity.

The one significant drawback to liquid cooling systems is that most can only be used with special hardware (usually based on standard components) supplied by the cooling system vendor or its partners. The rest can only work with existing server hardware after it has been modified by the vendor.

Hardcore Computers, a Minnesota-based systems manufacturer, offers a liquid cooling system called LSS (Liquid Submerged Server) 200, which works by pumping a coolant it calls Core Coolant through sealed server cases so that all the internal components are submerged in the liquid. The servers, which are based on Intel Xeon processors and use solid-state drives (SSDs) for internal storage, are priced at a slight premium to conventional servers with the same specification, according to Chad Attlesey, the company’s founder.

But the system enables significant energy cost savings, he said. “We can cut your data center power consumption in half because of the reduction in air conditioning and air moving equipment.”

A full rack of servers can require 10kW to power the server fans alone, while three racks can be cooled with a single 200W pump, without the need for CRAC equipment, he said. A liquid cooling system could pay for itself in as little as three years, said Attlesey.
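The fan-versus-pump comparison Attlesey draws can be sanity-checked with a quick back-of-the-envelope calculation. The 10kW and 200W figures are his; the round-the-clock operation and $0.10/kWh tariff are illustrative assumptions, not figures from the article:

```python
# Back-of-the-envelope check of the fan-vs-pump figures quoted above.
# 10 kW per rack of fans and one 200 W pump per three racks are the
# article's figures; 24x7 operation and a $0.10/kWh tariff are assumed.

HOURS_PER_YEAR = 24 * 365   # 8,760 hours of continuous operation (assumed)
TARIFF = 0.10               # assumed $ per kWh

fan_power_kw = 10.0         # fan power for ONE full rack (article figure)
pump_power_kw = 0.2         # one 200 W pump cools THREE racks (article figure)

# Annual electricity cost of fans for three racks vs. one shared pump
fan_cost = 3 * fan_power_kw * HOURS_PER_YEAR * TARIFF
pump_cost = pump_power_kw * HOURS_PER_YEAR * TARIFF

print(f"Fans (3 racks): ${fan_cost:,.0f}/yr")
print(f"Pump (3 racks): ${pump_cost:,.0f}/yr")
```

Under those assumptions the fans cost roughly $26,000 a year across three racks against under $200 for the pump, which makes the claimed three-year payback plausible even before CRAC savings are counted.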

A variation on this system is about to be launched by Iceotope, a UK-based vendor. The company’s Iceotope Platform server cabinet can hold up to 48 hot-swappable sealed servers, each filled with a 3M engineered coolant called Novec. The difference from Hardcore’s system is that the coolant remains sealed inside the servers and is then cooled by water flowing in a loop on the outside of the case.

Peter Hopton, Iceotope’s CTO, said that Iceotope servers consuming 300kW of power would normally require 150kW of power for cooling. Instead, the water pump uses about 1kW. He also said that servers cooled in this way will prove more reliable.

“The cooling is so uniform inside that components have almost no variation in temperature, so there is no thermal fatigue.”

The system will be priced in line with comparable air cooled systems, so that payback will be achieved from energy cost savings.

Asetek, a California-based company, has recently unveiled its Sealed Server Liquid Cooling system, which uses both air and liquid cooling driven by its Rack CDU (Coolant Distribution Unit). Each sealed server in a rack contains a pipe loop containing coolant. The loop cools two or more cold plates placed over the CPUs and (optionally) memory, and also cools the air in the sealed server, which in turn cools the rest of the server’s components.

The company estimates that a full rack would draw 21kW, requiring 7kW to power computer room air conditioning. Its liquid cooling would require around 3kW per rack, resulting in a saving of about $3,500 per rack per year at $0.10 per kWh. The company’s Rack CDU is priced to achieve a one-year payback period.
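Asetek’s per-rack savings estimate follows directly from the quoted figures. All numbers below are as stated in the article; the only added assumption is 24x7 operation:

```python
# Reproducing Asetek's per-rack savings estimate from the figures above.
# The 7 kW CRAC load, 3 kW liquid-cooling load and $0.10/kWh tariff are
# the article's figures; 24x7 operation is assumed.

HOURS_PER_YEAR = 24 * 365   # 8,760 hours (assumed continuous operation)
TARIFF = 0.10               # $ per kWh, as stated in the article

crac_cooling_kw = 7.0       # CRAC power needed for one 21 kW rack
liquid_cooling_kw = 3.0     # liquid-cooling power per rack

saved_kw = crac_cooling_kw - liquid_cooling_kw
annual_saving = saved_kw * HOURS_PER_YEAR * TARIFF

print(f"Annual saving per rack: ${annual_saving:,.0f}")
```

The 4kW difference works out to roughly $3,500 per rack per year, matching the company’s estimate.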

Green Revolution Cooling, a Texas-based company, offers probably the most radical departure from traditional data center cooling. Its CarnotJet system is based on the concept of dunking — literally placing an entire server rack into a tank of its GreenDEF coolant. The system can be used with almost any server as long as various modifications are carried out first. These include sealing hard drives (as these can’t function properly when immersed in liquid), removing internal cooling fans, and replacing any thermal compounds with ones that won’t dissolve in the coolant. The company said CarnotJet can cut cooling energy use by 90 percent and offer a payback period of between one and three years.

Since liquid cooling solutions need new or modified server hardware, their appeal is somewhat limited. But for enterprises planning a hardware refresh or a new data center build-out, liquid cooling could be a sound financial investment that reduces energy bills and enhances corporate green credentials.

Paul Rubens has been covering IT security for over 20 years. In that time he has written for leading UK and international publications including The Economist, The Times, Financial Times, the BBC, Computing and ServerWatch.

