
There are a lot of data centres around. A few are designed and operated very well and deliver great Power Usage Effectiveness. Some could do a bit better, perhaps with an airside economiser or two, some hot or cold aisle containment, or maybe some DC power. Some are just a nightmare and could benefit from the administration of a wrecking ball. For some data centres, it seems that no amount of fixing them up, improving plant and applying best practice will make any measurable difference. Let’s call them clunker data centres! (Maybe we can get the Government to do a cash-for-clunkers program for data centres?)
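
For readers new to the metric, PUE is simply total facility power divided by the power delivered to the IT load. The quick sketch below uses invented numbers purely for illustration, not figures from any particular site.

    # Quick illustration of Power Usage Effectiveness (PUE); the example numbers
    # are made up for illustration, not measurements from any real facility.
    def pue(total_facility_kw: float, it_load_kw: float) -> float:
        """PUE = total facility power / IT equipment power (1.0 is the ideal)."""
        return total_facility_kw / it_load_kw

    print(pue(total_facility_kw=1300, it_load_kw=1000))  # a well-run site: 1.3
    print(pue(total_facility_kw=2500, it_load_kw=1000))  # a clunker: 2.5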

A clunker starts off with a ceiling height that is too low for hot air to separate out and migrate towards the CRAC units without too much mixing. The plenum under the raised floor is shallow and clogged with cables and other detritus, choking off airflow from the CRACs. The perforated floor tiles have a low open area and poor airflow characteristics. The cabinets are all lined up like a schoolroom, front to back to front to back… you could cook turkeys in the back row. The CRAC units are low capacity, and that capacity is exhausted. Naturally, the boss wants you to install some 10 kW racks in a hurry for a critical business project.
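
As a rough sanity check on why that request hurts (my own back-of-envelope figures, not anything from the post), the airflow a 10 kW rack demands follows directly from heat = mass flow × specific heat × temperature rise, and a choked plenum with low-flow tiles simply cannot deliver it.

    # Back-of-envelope sketch: cold-air volume needed to absorb a rack's heat load.
    # Property values are textbook approximations; the 12 K delta-T is an assumption.
    RHO_AIR = 1.2      # kg/m^3, density of cool supply air (approx.)
    CP_AIR = 1005.0    # J/(kg*K), specific heat of air (approx.)

    def airflow_m3_per_s(heat_w: float, delta_t_k: float) -> float:
        """Volumetric airflow required to carry away heat_w watts at a given air delta-T."""
        return heat_w / (RHO_AIR * CP_AIR * delta_t_k)

    flow = airflow_m3_per_s(heat_w=10_000, delta_t_k=12)       # one 10 kW rack
    print(f"{flow:.2f} m^3/s, roughly {flow * 2119:.0f} CFM")  # ~0.7 m^3/s, ~1,500 CFM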

What can you do? Say “no way”? Offer co-location in a commercial facility as an alternative? Start looking for a new job?

I bumped into a possible solution a few days back on Twitter when I connected with Mary Hecht-Kissell (@PR_Strategies), who looks after Coolcentric. The problem set defined above, the one that makes a clunker data centre, is all about getting enough cold air into the servers to remove the excess heat. Every element in the clunker conspires to make delivering more cold air virtually impossible. That’s where the Coolcentric solution makes a difference. It delivers cold water right up against the servers, adding cooling capacity that enables that set of additional 10 kW (or more) racks to be installed in a data centre that seemed like a lost cause. It’s a fairly simple piece of technology that has been well engineered to be retrofitted to most types of existing cabinet. It’s a water-cooled door.

The water-cooled door is fitted onto the back of the rack so that the hot air exhausting from the cabinet is chilled immediately and very efficiently. Liquids are about 4,000 times more effective than air at removing heat, volume for volume, so these water-cooled doors can remove significantly more heat with very little pumping energy.
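
For a sense of where a figure like that comes from (textbook property values, not numbers from Coolcentric), compare volumetric heat capacities: the same volume of water soaks up a few thousand times more heat than air for the same temperature rise, which is why a modest water flow and a small pump can do the work of a huge volume of moved air.

    # Sketch of the liquid-vs-air advantage using approximate room-temperature
    # property values (density * specific heat = volumetric heat capacity).
    vol_heat_capacity_air = 1.2 * 1005.0      # J/(m^3*K) for air
    vol_heat_capacity_water = 998.0 * 4186.0  # J/(m^3*K) for water

    ratio = vol_heat_capacity_water / vol_heat_capacity_air
    print(f"Water absorbs ~{ratio:,.0f}x more heat per unit volume than air")  # roughly 3,500x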

One smart way to think about it is that the water-cooled door acts like a mini, contained hot aisle for environments (like our clunker data centre) where cabinet alignment, ceiling height and plenum problems make conventional hot aisle containment impossible.

Sounds like a pretty decent alternative to Semtex!

  • http://www.switchanddata.com Herb Villa

    Well, I have always been a fan of C4. Always looks good when they blow something up on television.

    But to be serious – why blame the data center? It is just a building/space/room/whatever with a bunch of inanimate stuff inside – cable, CRACs, racks, hardware, pipes, floors, etc. – that did not install or manage themselves. No – one must get to the root source – PEOPLE. And especially the ones that live by one of my personal golden rules: “Even if you make something idiot-proof, there is always an idiot out there working harder to prove you wrong.”

    And as with most really important things, it is critical to communicate. In the case of IT spaces – get IT AND Facilities talking. Set very specific operational rules – AND ENFORCE THEM. Make sure there are clearly defined plans for doing ANYTHING in the space, especially governing moves, adds and changes.

    And even if you are stuck with a less than optimal space, do try to make the best of it, hopefully without having to compromise operational functionality. And then see what is out there that may be helpful in solving some of the problems: new cable management solutions, different climate control options, improved power distribution, etc. And research any new solution with NO bias or prejudice – none of this “I will NEVER use water in my data center”. You have got a problem, so rule out no options.

    In a previous job, I saw some of these data centers – limited slab-to-slab clearance, no room to expand, unreliable and/or overextended infrastructure. And yet we were able to salvage them and turn them into first-class IT spaces, all because of excellent communication between all disciplines, a willingness to explore a wide variety of solutions, and a commitment to improvement.

    So don’t blame the space – look in the mirror and start from there.

  • http://thehotaisle.com Steve O’Donnell

    Great comment and thanks Herb.


  • Paul Kee

    Steve,

    Certainly there are many products out in the marketplace offering similar solutions (Knuerr with a bottom-mounted heat exchanger, Rittal with a side-mounted heat exchanger). A few things need to be considered, of course.

    How can the chilled water be routed out into the dataspace? Will this disrupt the business operations?
    How do you mitigate pipework failure in the dataspace?
    How can rack row configurations be accommodated? This cannot replace racks one for one as the racks will simply be deeper.

    The cooling distribution unit will inevitably add energy use to the overall system. Of course, this is countered by more efficient cooling (a reduction in hot/cold air contamination, etc.).

    I have had both types of system installed before (conventional CRAC cooling and high density) and there is just no silver bullet. The datacentre operator will need to bite one of the many bullets to solve the issue.

    Have you had any experience with the installation of Coolcentric units that you can share? Are there general issues, such as fan redundancy, coil redundancy or low-load bypass valves, which need to be considered?