
Are Data Center CRACs Going the Way of the Ice Box?

Cooling systems represent about 70% of a data center's total non-IT energy consumption.  Eliminating cooling mechanicals, e.g., CRACs* and chillers, would be a significant step toward major energy and cost savings when you consider that many data centers consume hundreds to thousands of kilowatts of power - oodles more than office space.  But in regions with hot and/or humid climates, isn't mechanical air conditioning a necessity to keep IT equipment humming?
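To make that concrete, here's a back-of-envelope sketch in Python.  All of the inputs (IT load, PUE, electricity price) are illustrative assumptions picked for the example, not figures from any particular facility:

```python
# Back-of-envelope estimate: annual savings from eliminating mechanical cooling.
# All inputs are illustrative assumptions, not measurements from a real facility.

IT_LOAD_KW = 1000      # assumed average IT load: 1 MW
PUE = 1.8              # assumed power usage effectiveness
COOLING_SHARE = 0.70   # cooling's share of non-IT energy (the figure cited above)
PRICE_PER_KWH = 0.10   # assumed electricity price, USD

HOURS_PER_YEAR = 8760

non_it_kw = IT_LOAD_KW * (PUE - 1)      # overhead: cooling, power distribution, lighting
cooling_kw = non_it_kw * COOLING_SHARE  # portion attributable to cooling mechanicals

annual_kwh = cooling_kw * HOURS_PER_YEAR
annual_usd = annual_kwh * PRICE_PER_KWH

print(f"Cooling load: {cooling_kw:.0f} kW")
print(f"Potential savings: {annual_kwh/1e6:.1f} GWh/yr, ~${annual_usd:,.0f}/yr")
```

Even at these modest assumed numbers, the mechanical cooling load works out to roughly half a megawatt, or about half a million dollars a year (fans for moving outside air would claw back some of that, but only a fraction).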


Not so anymore, according to The Green Grid (TGG).  At its conference in early March, TGG announced updated free-cooling maps based on the new American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) guidelines for allowable temperature and humidity ranges for various classes of IT equipment.  These maps show where in the world, and for how many hours per year, outside air can be used for cooling ("air-side economization").  There's a lot of discussion in the whitepaper about dry-bulb and dewpoint temperatures and psychrometric charts that I won't bore you with.  The net-net: depending on the ASHRAE classification of the IT equipment in use (A1-A4, with A4 being the most heat- and humidity-tolerant), free cooling can be used year round in 75%-100% of North America and in greater than 97% of Europe, even with temperatures as high as 113°F!  Japan's environment is more challenging at 14%-90%.

The maps below, reproduced from TGG's whitepaper, show free-cooling ranges for the more delicate A2-classified IT equipment.  In full disclosure, to achieve 100% free cooling in some locations, operators must be okay with occasional excursions into heat and humidity ranges outside the recommended ASHRAE ranges.  But when one weighs the infrequency and short duration of these excursions, and the risk of IT equipment failure, against the CAPEX and OPEX savings of doing without mechanical cooling, it's certainly worth a look.
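For the technically inclined, the maps boil down to a simple hourly test: does the outside air fall within the allowable envelope for your equipment class?  Below is a minimal sketch of that check in Python.  The dry-bulb and dewpoint limits are the commonly cited 2011 ASHRAE allowable values as I understand them (verify against the published guidelines before relying on them), and the low-end humidity limits are simplified away:

```python
# Sketch: count free-cooling hours by testing hourly weather readings against
# an ASHRAE allowable envelope. Limits are the commonly cited 2011 allowable
# dry-bulb ranges and maximum dewpoints (verify against the current guidelines);
# low-end humidity limits are omitted for simplicity.

ALLOWABLE = {
    # class: (min dry-bulb °C, max dry-bulb °C, max dewpoint °C)
    "A1": (15, 32, 17),
    "A2": (10, 35, 21),
    "A3": (5, 40, 24),
    "A4": (5, 45, 24),  # 45°C is the 113°F ceiling mentioned above
}

def free_cooling_ok(dry_bulb_c: float, dewpoint_c: float, equip_class: str = "A2") -> bool:
    """True if this hour's outside air fits the class's allowable envelope."""
    # In practice, air below the low end is simply mixed with warm return air,
    # so only the upper limits really constrain free cooling; the low bound is
    # kept here to mirror the envelope as published.
    lo, hi, max_dp = ALLOWABLE[equip_class]
    return lo <= dry_bulb_c <= hi and dewpoint_c <= max_dp

# Example: hourly (dry-bulb, dewpoint) readings in °C; a real run would use
# a full year (8,760 entries) of weather data for the site.
hourly_weather = [(18.0, 9.0), (36.5, 22.0), (28.0, 15.5)]

hours = sum(free_cooling_ok(db, dp, "A2") for db, dp in hourly_weather)
print(f"Free-cooling hours: {hours} of {len(hourly_weather)}")
```

Feed in a year of hourly weather data for a candidate site and the fraction of passing hours is, in essence, the number the TGG maps plot.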

Still not convinced?  Big data center operators aren't waiting to put theory into practice.  eBay is operating its new Phoenix data center with 100% free cooling year round, even on 115°F days!  And Facebook's state-of-the-art Prineville, Oregon data center was built for free cooling only.  I know, these companies deploy masses of servers on a monthly basis and don't have voided equipment warranties to worry about.  But most technology refreshes happen on a three-year time frame, not too far off to assess your free-cooling options for the next planning cycle.  And consider that just turning on air-side economization has been found to save an average of 20% in money, energy, and carbon.

* CRAC = computer room air conditioner

Figure 2a.  Free-cooling map for Europe for A2-classified IT equipment, based on the updated ASHRAE guidelines.


Figure 2b.  Free-cooling map for the U.S. for A2-classified IT equipment, based on the updated ASHRAE guidelines.


Figure 2c.  Free-cooling map for Japan for A2-classified IT equipment, based on the updated ASHRAE guidelines.

Source:  The Green Grid

Nicola Peill-Moelter is Akamai's Director of Environmental Sustainability.
