When it comes to maintaining data centres, the biggest energy cost is cooling.
CCN talks to Damien Spillane, head of sales engineering at data centre provider Digital Realty Australia, about cooling cost savings.
As the author of a Digital Realty white paper that examines data centre cooling, Damien Spillane has identified a range of opportunities to improve efficiency.
Spillane says cooling is one area with the potential to deliver significant energy savings.
In fact, Google has identified it as the single biggest opportunity for efficiency, one that is “greater than all areas combined”.
Traditionally, data centre temperatures have been moderated by precision cooling topologies that expend a lot of energy, Spillane says.
“Hence, the focus has shifted profoundly toward the machines and systems charged with monitoring and maintaining the temperatures and humidity levels within data centres,” he says.
Prior to 2011, using free air to cool data centres wasn’t a viable option, but in that year ASHRAE (the American Society of Heating, Refrigerating and Air-Conditioning Engineers) expanded the recommended environmental range for data centres.
Back then, the temperature range was particularly narrow, between 20 and 25°C, and relative humidity was between 40 and 55 per cent, Spillane says.
“This meant that only precision cooling systems could be deployed in sealed computer rooms. Invariably, these systems involved direct expansion (DX) cooling with computer room air conditioning (CRAC) units that rejected the heat via compressor-driven refrigerant circuits, or chilled water (CHW) computer room air handling (CRAH) units that rejected the heat via a chiller/cooling tower.
“Little consideration was given to using low ambient conditions to reduce energy consumption. This tight window of environmental control was driven by IT equipment requirements and to maintain the reliability of rack-mounted equipment.
“Broadening the temperature and humidity window was a milestone change.”
Spillane says ASHRAE’s recommendation has enabled the deployment of more efficient cooling topologies such as free air cooling, and has helped lower energy consumption and the overall cost of data centre operations.
“Now the new humidity window can be coupled with outside air, albeit heavily filtered outside air,” he says.
In the right climatic zones, it is possible to build data centres that rely almost entirely on free cooling.
The use of air-side economisation (and water-side economisation with a cooling tower), versus dry-cooler-type water-side economisation, also increases the number of locales across the globe where chiller-less facilities can be deployed.
To deploy free air cooling in a data centre, Spillane says the key starting point is a comprehensive climate analysis of the proposed site to ensure the environment is suitable.
The availability and commercial viability of water is also a consideration when choosing a free air cooling strategy, particularly when looking at adiabatic cooling.
Spillane says Australia is categorised as a high-risk area for water deficiency.
A lifecycle analysis found that using water for heat rejection in Australia is not viable, especially when compared to the advantages of free air cooling, he says.
Next, a site’s microclimate is assessed; this can be affected by factors such as proximity to the sea or undulating terrain. Digital Realty has completed these assessments in Sydney and Melbourne.
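By way of illustration, the sketch below shows the general shape such a climate analysis can take: given hourly temperature and humidity readings for a candidate site, count the fraction of the year that falls inside an acceptable supply-air envelope. The envelope bounds, sample data and code are illustrative assumptions, not Digital Realty’s actual methodology.

```python
# Illustrative sketch of a free-cooling climate analysis (assumed envelope,
# not Digital Realty's methodology). Given hourly ambient readings for a
# candidate site, count the fraction of the year in which outside air falls
# within a simplified ASHRAE-style window and could be used for free cooling.

# Simplified envelope bounds (assumed values for illustration only)
TEMP_MIN_C, TEMP_MAX_C = 18.0, 27.0   # supply-air temperature window
RH_MIN, RH_MAX = 20.0, 80.0           # relative humidity window, per cent

def free_cooling_fraction(hourly_readings):
    """hourly_readings: iterable of (temperature_c, relative_humidity)
    tuples, ideally 8,760 entries covering a full year."""
    readings = list(hourly_readings)
    usable = sum(
        1 for temp, rh in readings
        if TEMP_MIN_C <= temp <= TEMP_MAX_C and RH_MIN <= rh <= RH_MAX
    )
    return usable / len(readings)

# Toy example: three hours of fabricated data, not real Sydney weather.
sample = [(21.5, 55.0), (30.2, 70.0), (16.0, 45.0)]
print(f"Free cooling available {free_cooling_fraction(sample):.0%} of sampled hours")
```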
“Using outside air to cool data centres requires planning and should be designed into the master plan of the facility,” Spillane says.
“This is due to the fact that the cooling medium is air delivered solely by large air-handling units with integrated economisers, which should be in close physical proximity to the data centre space.
“These large air handling units can be mounted on top of or alongside the data centre space, and although the overall spatial requirement is lower than for legacy cooling systems, it does require closer proximity.
“Because of this, retrofitting free air cooling technology to a legacy site can be prohibitive and complicated.”
There are a number of mechanisms employed to deliver free air cooling to data centres, including economisers, indirect free air, thermal wheels and plate heat exchangers.
After assessing all of the mechanisms and options, Spillane says the direct system presents a compelling case due to the efficiency gains.
“There is no heat exchange loss through heat exchangers or thermal wheels, and no additional fan pressure drops. The simplicity of the system also means there are fewer moving parts, so there are fewer problems,” he says.
“Rooftop air handling units are equipped with standard mechanical components to enable the delivery of cooled air, which greatly simplifies the operating principles of the units.
“Air is supplied to the data hall via the supply air fan, and returned via the extract fans.
“The units will use ambient air when they can, and the economiser cycle will operate at the most efficient mode possible, ensuring an optimum power usage effectiveness (PUE).”
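To give a rough idea of what that economiser decision looks like, here is a minimal control sketch. The thresholds and mode names are assumptions for illustration; real units modulate dampers and stage mechanical cooling rather than switching between discrete modes.

```python
# Simplified economiser mode selection (illustrative thresholds, not a real
# controller). The unit prefers free outside air, falls back to partial
# (mixed) economisation, and only engages mechanical cooling as a last resort.

def select_mode(outside_temp_c, supply_setpoint_c, return_temp_c):
    if outside_temp_c <= supply_setpoint_c:
        # Ambient air is cool enough on its own; it can be mixed with
        # warmer return air if needed to avoid over-cooling.
        return "full free cooling"
    if outside_temp_c < return_temp_c:
        # Ambient is warmer than the setpoint but still cooler than the
        # return air, so it can do part of the work.
        return "partial free cooling"
    return "mechanical cooling"

for ambient in (14.0, 22.0, 33.0):
    print(ambient, "->", select_mode(ambient, supply_setpoint_c=18.0, return_temp_c=30.0))
```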
Spillane has identified key areas of focus during deployment, including the efficiency of the fans.
“Given that the fans are constantly running and essentially replace the DX cycle as an operational base load, ensuring they operate efficiently at all ranges and use variable-flow technology is vital,” he says.
“Also with this technology, designing and maintaining the filtration on the air handling units is vital.
"It is also important to ensure that the air flow from the roof mounted units is delivered to the white space with as little pressure drop as possible, but also in a way that enables the flexibility to deploy high density racks where required.”
Another key area is monitoring and sensing. A comprehensive system of decentralised controls, unit cycling, multiple redundant sensing locations (including weather stations) and air quality analysis will ensure the system is autonomous and can operate at maximum efficiency in all conditions.
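One generic pattern for making redundant sensing locations robust (a common technique, not necessarily the scheme Digital Realty deploys) is to consolidate readings with a median, so a single failed or drifting sensor cannot skew the control input:

```python
# Generic redundant-sensing pattern: take the median of available readings
# so one failed or drifting sensor cannot skew the control input.
from statistics import median

def consolidated_reading(readings, min_sensors=2):
    """readings: temperature values from redundant sensors; None = offline."""
    live = [r for r in readings if r is not None]
    if len(live) < min_sensors:
        raise RuntimeError("insufficient live sensors; fail safe")
    return median(live)

# One sensor offline, one reading anomalously high: the median stays sensible.
print(consolidated_reading([22.1, None, 22.4, 35.0]))  # -> 22.4
```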
“Using outside air to cool data centres can offer significant advantages, such as driving down total cost of occupancy (TCO) and improving PUE, and can also lead to significant energy savings by eschewing traditional cooling methods such as chillers or compressors,” Spillane says.
“Digital Realty has found that PUEs of less than 1.3 are achievable with zero water consumption.
“This is achievable because full and partial free air cooling can be deployed for a large portion of the year: 76 per cent in Sydney and 91 per cent in Melbourne.
“Another major advantage is the ease of operation associated with free air cooling systems versus traditional chilled water deployments.
“Using free air to cool data centres is a risk-averse, cost-efficient and energy-efficient solution for companies looking to minimise the carbon footprint of their data centre deployments.”
What is PUE?
Power usage effectiveness, or PUE, is a measure of the efficiency of a data centre’s mechanical, electrical and ancillary systems in delivering power to its IT equipment. It is equal to the total incoming power to the data centre, typically rendered in kilowatt hours (kWh) or kilowatts (kW), divided by the power supplied to the racks.
A 2013 independent Campos study of the Australian data centre market reported an average PUE of 2.25, which means that for every kW delivered to the rack, a further 1.25kW was lost to cooling and system inefficiencies.
Put simply: PUE = 2.25kW / 1kW = 2.25.
One solution to the high cost of cooling is free air, which can deliver a PUE of less than 1.3. Data centres with PUEs of less than 1.3 enjoy a 42 per cent reduction in energy consumption over data centres with PUEs of 2.25.
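The arithmetic behind these figures is simple enough to verify directly; the snippet below just restates the numbers from the sidebar:

```python
# PUE = total facility power / power delivered to IT equipment.
def pue(total_kw, it_kw):
    return total_kw / it_kw

print(pue(2.25, 1.0))  # the 2013 Campos average: 2.25

# For the same IT load, total energy scales with PUE, so moving from
# 2.25 to 1.3 cuts total consumption by (2.25 - 1.3) / 2.25, about 42%.
reduction = (2.25 - 1.3) / 2.25
print(f"{reduction:.0%}")  # -> 42%
```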
Moreover, countries across the globe are relying on regulations and guidelines to improve energy efficiency. Last year Australia introduced NABERS for data centres, which benchmarks the energy performance of a data centre with infrastructure measurements based on an annual PUE.
Consolidation in itself does not guarantee a reduced carbon footprint, but leveraging the increased scale of power and cooling required for these large facilities, by applying innovative design principles, installing dedicated systems and implementing highly specialised operations, can generate significant savings, Digital Realty research has found.
Digital Realty carried out a total cost of occupancy (TCO) study on the commercial benefits of consolidating from an existing office data centre to an outsourced/turnkey environment.
Based on a 1000-square-metre data centre with 5kW per rack and a PUE of 2.25, the study found that potential savings of up to 30 per cent were achievable, largely from the energy saved by the more efficient cooling architecture of a turnkey data centre with a PUE of 1.3.
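To see where the energy component of that saving comes from, a rough comparison can be run with an assumed rack count and electricity tariff. Neither figure comes from the study, and the study’s 30 per cent covers costs beyond energy alone.

```python
# Rough energy-cost comparison between the two PUEs in the study.
# Rack count and tariff are assumptions for illustration only.

racks = 200                 # assumed count for a 1000 m^2 hall
it_load_kw = racks * 5      # 5 kW per rack, per the study
tariff = 0.15               # assumed $/kWh
hours_per_year = 8760

def annual_cost(pue):
    return it_load_kw * pue * hours_per_year * tariff

legacy, turnkey = annual_cost(2.25), annual_cost(1.3)
print(f"legacy ${legacy:,.0f}/yr vs turnkey ${turnkey:,.0f}/yr "
      f"({1 - turnkey / legacy:.0%} less energy spend)")
```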
About Damien Spillane
Damien Spillane is the head of sales engineering for Digital Realty Australia. He is responsible for helping customers and their consultants with their data centre requirements, with a remit that covers sales, design, construction and operations.
He joined Digital Realty in 2011. With over a decade of data centre experience, Spillane has worked as a design consultant, main contractor and client representative in Europe and the Asia-Pacific market.
Before joining Digital Realty, he acted as the functional head for a global data centre engineering consultancy, leading a team of more than 20 engineers.