The Hot Aisle
Fresh Thinking on IT Operations for 100,000 Industry Executives

Today we are going to look at practical changes we can all make to our data centres to improve cooling efficiency, enabling us to install more equipment in the same space or to reduce our electricity bills.

There are a few basic rules that will help us become more efficient.

  • The first is that the less we mix hot air from computer cabinets with the cold air coming from the Computer Room Air Conditioning (CRAC) units, the more efficiently we will cool the data centre.
  • Secondly, CRAC units typically draw hot air from the upper part of the computer hall, cool it and blow it under the raised (plenum) floor. Be careful when placing open tiles near CRAC units, to avoid short circuits in the airflow.
  • Thirdly, computer room cabinets are designed for front-to-back airflow, so it is important to have open tiles in front of cabinets to supply cold air from under the raised floor. Avoid putting open tiles at the back of cabinets, as this encourages hot and cold air mixing.
  • Fourthly, try to organise your rack layout so that adjacent rows are arranged front-to-front and back-to-back. This is known as a hot aisle / cold aisle arrangement, and it helps to separate hot and cold airflow.

So where can we start to make our existing data centre more efficient and greener? The first step is maintaining the floor, a task that is too often ignored. Gaps and unnecessary holes in the floor are bad news and lead to the mixing of hot and cold air, which drops cooling efficiency.

Loose tiles can cause leakage (and may be a health and safety issue), so they should be replaced. Keep the floor maintained: replace tiles with holes in them and close vents that are open in the wrong place. Better still, take out all unused vented tiles and replace them with plain tiles; even closed vented tiles leak air.

You should test the soundness of your floor by pressure testing, and check that all cableways leading between halls are properly blocked off. This is important for a number of reasons: it reduces cold air leakage, slows the spread of fire and smoke within the facility and discourages rodent entry.

Cold air leakage from below the floor can make a significant difference to data centre energy efficiency. Cable entry points into cabinets need to be closed off as well, as they offer an alternative airflow route from cold to hot that bypasses the equipment we need to cool. Fit raised-floor grommets or beanbags around the holes to close off excess airflow.

Blockages under the raised floor reduce airflow efficiency and can cause hotspots in the data centre halls, because they divert airflow away from key areas. Keep spare rolls of cable and other junk either in a store cupboard or, at worst, above floor level.

Counter-intuitively, you can ignore areas of the data centre that are hot (this can actually be a good thing) provided they are not located in front of cabinets. Hot spots in front of cabinets are bad, as they mean the equipment will be drawing in hot air, reducing cooling efficiency.

Hot spots in front of cabinets generally mean that the rack layout is poor, with cabinet backs facing cabinet fronts. Look to implement hot aisle / cold aisle separation.

Resist the temptation to place open tiles in hot aisles; the “advice” that this lifts the hot air up so that it extracts efficiently is utter nonsense and should be ignored. Hot aisles are designed to be hot, and they are less efficient if they have open tiles in them. Don’t worry about the humans; they can survive high exhaust temperatures for long periods.

Poor airflow from open tiles can be caused by a number of problems: too many open tiles reducing the plenum pressure overall, or a blocked underfloor caused by overflowing cable trays or stored junk. Badly maintained refrigeration units or blocked filters can also reduce airflow, further lowering plenum pressure.
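To see why surplus open tiles hurt, here is a minimal Python sketch treating each vented tile as a simple orifice; the discharge coefficient, tile area and CRAC flow figures are illustrative assumptions of mine, not measurements:

```python
# Minimal sketch (my own simplification, not from the article): treat each
# vented tile as an orifice and see how adding tiles drops plenum pressure.
RHO = 1.2          # air density, kg/m^3
CD = 0.6           # assumed orifice discharge coefficient
TILE_AREA = 0.14   # assumed open area of a vented 600 mm tile, m^2
CRAC_FLOW = 4.0    # assumed total airflow from the CRAC fans, m^3/s

def plenum_pressure(n_tiles: int) -> float:
    """Pressure (Pa) needed to push the fixed CRAC flow through n tiles."""
    per_tile_flow = CRAC_FLOW / n_tiles          # m^3/s through each tile
    velocity = per_tile_flow / (CD * TILE_AREA)  # effective jet velocity, m/s
    return 0.5 * RHO * velocity ** 2             # dynamic pressure, Pa

for n in (10, 20, 40):
    print(f"{n} open tiles -> plenum pressure ~{plenum_pressure(n):.1f} Pa")
# Doubling the tile count quarters the plenum pressure (P ~ 1/n^2), which is
# why surplus open tiles starve the tiles that actually matter.
```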

If you are designing a new hall, keep the cable trays above floor level; it can make a huge difference and is much easier to manage. The photograph below is from the iNet Data Centre in Milan and shows a great arrangement.

If you really need to have high-density equipment in a hall, you might consider hot aisle containment, so that you can exhaust the air from that equipment straight out to the atmosphere and bring in fresh air to replace it. An example is shown below.

Maintaining strong separation between hot and cold aisles is particularly important. Even when hot aisle and cold aisle separation is implemented, about 40% of the cold air short-circuits past the equipment it is supposed to cool. Blocking off open gaps in cabinets with blanking plates is key, as is fitting empty (and fully blanked) cabinets into gaps in rows, or alternatively fitting full-length curtains across the gaps.
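As a back-of-envelope illustration of what that 40% costs, here is a hedged sketch; the IT load, temperature rise and air properties below are assumed figures:

```python
# Sketch (assumed numbers) of what a 40% bypass fraction costs: cold air that
# short-circuits past the equipment does no cooling, so the CRAC units must
# move, and chill, correspondingly more air.
def required_supply(it_load_kw: float, delta_t: float, bypass: float) -> float:
    """CRAC airflow (m^3/s) needed for a given IT load and bypass fraction."""
    rho, cp = 1.2, 1.005  # air density (kg/m^3), specific heat (kJ/kg.K)
    useful_flow = it_load_kw / (rho * cp * delta_t)  # air doing real work
    return useful_flow / (1.0 - bypass)              # inflate for the bypass

for bypass in (0.0, 0.4):
    flow = required_supply(it_load_kw=100, delta_t=12, bypass=bypass)
    print(f"bypass {bypass:.0%}: supply ~{flow:.1f} m^3/s")
# Cutting bypass from 40% towards zero with blanking plates and curtains trims
# the required airflow (and the fan energy behind it) by the same 40%.
```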

The photograph above was taken in BT’s Rochdale Data Centre. It shows the use of curtains to separate the hot and cold aisles, thereby reducing the amount of hot air that mixes with cold air from the plenum floor and dramatically increasing cooling efficiency. Inside the curtained-off space are all of the open vents from the plenum floor and all of the cabinet fronts. There are no open tiles in the hot aisles.

In combination with the curtains, each cabinet is fitted with blanking plates to ensure that the only path for cold air is through a piece of equipment. Missing cabinets and missing equipment offer an alternative route for airflow, causing hot and cold air to mix and reducing air-handling efficiency.

If you can’t fit blanking plates, move the equipment to the bottom of the cabinet; remember that hot air rises, so the air is likely to be cooler at the bottom of a rack than at the top.

The difference in temperature between the air inside the curtained-off area and that outside is quite dramatic, in the range of 10 to 15 °C. Refrigeration efficiency is increased by approximately 20%.
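As a rough sanity check on that figure, here is a sketch using an idealised Carnot-fraction chiller model; the evaporator and condenser temperatures and the Carnot fraction are assumptions for illustration, not measurements from Rochdale:

```python
# Idealised model: a warmer return air stream lets the evaporator run warmer,
# which raises the chiller's coefficient of performance (COP).
def cop(evap_c: float, cond_c: float, carnot_fraction: float = 0.5) -> float:
    """COP estimated as a fixed fraction of the Carnot limit (an assumption)."""
    evap_k, cond_k = evap_c + 273.15, cond_c + 273.15
    return carnot_fraction * evap_k / (cond_k - evap_k)

before = cop(evap_c=10, cond_c=45)  # lukewarm, mixed return air
after = cop(evap_c=15, cond_c=45)   # hotter return after containment
print(f"COP {before:.2f} -> {after:.2f}, about {after / before - 1:.0%} better")
# A 5 C rise at the evaporator gives roughly the 20% gain quoted above.
```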

Installing curtains and organising cabinets and open floor tiles sensibly is something everyone can do, regardless of the age and condition of their data centre site. If the site is constrained by cooling capacity, the installation of curtains will allow more equipment to be installed. As a minimum, the energy bills will drop markedly.

Using fresh air to cool computer equipment is more common than you might think. Here are some examples of where it is commonplace:

Many vendors expect their equipment to be used in office environments without raised floors or dedicated air-handling units. There are hundreds of thousands of small companies with a few servers installed in a cupboard or in the corner of the office, all running at ambient temperature. The big difference between equipment located in an office space and in a dedicated data centre is packing density: the number of kW per square metre.
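As a quick illustration of that density gap, with assumed figures:

```python
# Assumed, illustrative figures: a few servers in an office corner versus a
# single populated rack (plus its share of aisle space) in a data hall.
office_kw, office_m2 = 1.2, 20.0  # three ~400 W servers in a 20 m^2 office
hall_kw, hall_m2 = 5.0, 2.5       # one 5 kW rack on a 2.5 m^2 footprint
print(f"office: {office_kw / office_m2:.2f} kW/m^2")  # ~0.06 kW/m^2
print(f"data hall: {hall_kw / hall_m2:.1f} kW/m^2")   # ~2.0 kW/m^2
```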

Telecommunications companies have used fresh-air cooling for decades for the network equipment in their telephone exchanges. Vendors manufacture most of their products to operate in this environment, with specified operating temperature ranges of up to 50 °C (122 °F) and expected power densities of 0.5 to 1.5 kW per square metre.

At BT we constructed 106 21st Century Data Centres that run on fresh-air cooling. These are not design studies but real, live sites delivering service to customers.

How practical is fresh-air cooling? Depending on the location of your site and its maximum year-round temperature, the answer can be: very practical. In almost any location it is possible to reduce the amount of air conditioning required, provided that the external temperature is cooler than the air exhausted from your equipment for at least part of the day.
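One way to size the opportunity for your own site is to count the hours in a year when the outside air is cool enough to help. A minimal sketch, assuming you have a list of hourly temperature readings; the load_hourly_temps helper and the threshold figures are hypothetical:

```python
# Sketch: estimate what fraction of the year free cooling could carry the load.
def free_cooling_fraction(hourly_temps_c, exhaust_c=35.0, approach_c=3.0):
    """Fraction of hours when outside air is usefully cooler than the exhaust.

    approach_c is an assumed margin for heat pickup in ducts and
    imperfect heat exchange.
    """
    limit = exhaust_c - approach_c
    usable = sum(1 for t in hourly_temps_c if t < limit)
    return usable / len(hourly_temps_c)

# With 8,760 hourly readings for your site (the loader is hypothetical):
# temps = load_hourly_temps("site_weather.csv")
# print(f"free cooling available ~{free_cooling_fraction(temps):.0%} of the year")
```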

Manufacturers’ specifications generally refer to the air temperature at the inlet to the computer equipment, not the temperature inside the cabinet. Note that higher inlet temperatures tend to cause the temperature-controlled fans inside most equipment to run at higher speed, increasing the energy used. Provided that exhaust air from the hot aisle is not allowed to mix with fresh air in the cold aisle, and that the room air-handling systems can deliver enough air, the inlet temperature can be kept well below 40 °C.
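The fan-speed penalty follows from the fan affinity laws, under which power scales with the cube of speed. A minimal sketch; the idle speed and ramp range below are an assumed response curve, not vendor data:

```python
# Fan affinity law: power ~ speed^3, so modest inlet temperature rises can
# carry a steep fan-energy penalty once the fans start ramping.
def fan_power_fraction(inlet_c: float) -> float:
    """Fan power as a fraction of maximum, for an assumed linear speed ramp."""
    # Assume fans idle at 40% speed up to 25 C, ramping linearly to 100% at 40 C.
    ramp = max(0.0, min(1.0, (inlet_c - 25.0) / 15.0))
    speed = 0.4 + 0.6 * ramp
    return speed ** 3

for t in (22, 28, 34, 40):
    print(f"inlet {t} C -> fan power ~{fan_power_fraction(t):.0%} of maximum")
```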

Humidity control is designed to prevent condensation on cold surfaces (the concern at high relative humidity, above roughly 80%) and the build-up of static (the concern at very low humidity, below roughly 10%). The absence of refrigeration means that there are no surfaces colder than the airflow, making condensation impossible. As input air is warmed on its way through the room, its relative humidity decreases, further reducing the likelihood of condensation. Very dry input air is unusual in a temperate climate and is usually only caused by refrigeration.
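To see how sharply relative humidity falls as the air warms, here is a small sketch using the Magnus approximation for saturation vapour pressure; the inlet conditions are assumed:

```python
import math

def saturation_hpa(t_c: float) -> float:
    """Saturation vapour pressure over water (hPa), Magnus approximation."""
    return 6.112 * math.exp(17.62 * t_c / (243.12 + t_c))

def rh_after_warming(rh_in: float, t_in: float, t_out: float) -> float:
    """Relative humidity after heating air at constant moisture content."""
    return rh_in * saturation_hpa(t_in) / saturation_hpa(t_out)

# Fresh air drawn in at 20 C and 60% RH, leaving the equipment at 35 C:
print(f"~{rh_after_warming(60, 20, 35):.0f}% RH at the exhaust")  # ~25%
```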

When installing curtains to separate hot and cold aisles, it is important to consider the impact on fire detection, fire suppression and personnel evacuation.

Let’s start with the impact of hot or cold aisle containment on fire systems. The safety of personnel working within the data centre is absolutely critical, and there are a number of key issues to be concerned with:

  1. Are audible or visual fire alarm signals blocked out by the containment system? Can you hear and/or see that an alarm is going off? This needs a live test with dB monitoring equipment to ensure that all building and safety regulations are being met.
  2. Will the gas discharge distribute properly throughout the room with the containment system in place? Tests at Rochdale showed that the air circulated around the room very quickly (three times a minute) and that all gas was evenly distributed within 30 seconds. For all practical purposes, the containment system made no measurable difference to the effectiveness of the fire suppression system. As long as the containment system does not interfere with the gas discharge nozzles, it does not really matter whether they are located inside the containment system, outside it, under the floor void or in the roof space. Note, however, that where there is an element of free-air (fresh-air) cooling, gas-discharge fire suppression is not particularly effective.
  3. Could the pressure wave from the gas discharge dislodge the containment system and cause injury to occupants? This is a risk with any gas suppression system, where ceiling tiles, floor tiles, cabinet doors and other objects can be dislodged by the pressure wave. In all cases it is important that the system is not armed while personnel are inside the protected area, and that emergency cut-off switches are provided in convenient locations.

Numerous data centres employ hot aisle / cold aisle containment, including those of Sun Microsystems. Storage vendor NetApp has employed vinyl curtains, similar to those at BT, to contain the air in the hot aisles of its Silicon Valley data centre. These curtains alone save NetApp 1 million kWh of energy per year.

Yahoo Inc. has also employed vinyl curtains in one of its data centres for airflow containment.

Beyond the fire-suppression issue, hot aisle / cold aisle containment seems like a no-brainer and the next evolutionary step for data centre cooling strategies.

 


  • Tim Sunderland (http://www.simplexisolationsystems.com)

    Simplex Isolation Systems provided the curtains and strip doors which were used by NetApp to meet their goal of saving 1 million kWh of energy per year. The system uses fuse links for mounting the curtains. Fuse links work on the same principle as fire sprinklers: sprinklers are held in the closed position by a fusible alloy wire, and when a fire reaches a predetermined temperature, the alloy melts and opens the valve for the sprinkler. By using fuse links to attach the mounting hardware of the curtains and strip doors in the data center, Simplex is able to ensure that the curtains will fall away and the sprinklers will be fully operational.

    Simplex has specially formulated vinyls which meet the standards of the California (United States) State Fire Marshal, the requirements of the American Society for Testing and Materials (ASTM), and the National Fire Protection Association (NFPA).

    Regarding the aesthetics of these systems, Simplex has also installed several systems using hard-wall partitions. For data centers in high-visibility locations that receive a good deal of VIP visitors, these panels are an ideal solution. They do require additional add-ons for the fire sprinkler system to meet approval.

    Also keep in mind, at least in the United States, that many electrical utilities provide a rebate for installing these systems. When you add the rebate from Pacific Gas & Electric (PG&E), the NetApp system paid for itself in about five weeks. You can see the Simplex data center curtains and partitions at http://www.simplexstripdoors.com/dccurtains.htm.

    Tim Sunderland, Marketing, Simplex Isolation Systems.

  • chouse

    The SubZero 'Mini-Cube' cable cut-out cover beanbags can be found at http://www.midwestcomputerit.com/SubZeroMini-Cu… and the KoldLok Floor Grommets are at http://www.dataclean.com/koldlok-raised.htm

  • Scott Kantner (http://blog.dssdatacenter.com/)

    Great post! Only simple, practical ideas are going to get executed; major-league green initiatives are just too costly.