Next week, Iceotope will announce that it has developed, patented and manufactured an extreme cooling solution for IT equipment. The solution tackles the problem of cooling servers in data centres all the way from the actual source of the heat, at the processor and memory component level, to its final destination, the air outside, by using a 100% liquid path some 4,000 times more efficient than air.

The system operates at a remarkable PUE of just over 1 and can run almost anywhere on the planet without refrigeration. It is silent and reliable, and can house the very highest-performance systems with the highest-power parts at extreme density. If you plan to visit Supercomputing next week in Portland, Oregon, you can see the product at Booth 2355.
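
For readers unfamiliar with the metric, PUE (Power Usage Effectiveness) is the ratio of total facility power to IT equipment power, so a value of 1.0 would mean every watt drawn from the grid goes into computing rather than into cooling and other overheads. Here is a minimal sketch in Python; the figures in it are illustrative assumptions, not measured Iceotope values.

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness = total facility power / IT equipment power."""
    return total_facility_kw / it_equipment_kw

# Illustrative comparison (assumed figures, not measurements):
print(pue(2000.0, 1000.0))  # 2.0  - a typical legacy air-cooled facility
print(pue(1050.0, 1000.0))  # 1.05 - "just over 1", as claimed here
```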

There are more pictures and further information on the Liquid Cooled Blades page.

The heat produced by the components inside each server is captured by immersing the server motherboards in individually sealed baths of an inert synthetic liquid coolant. With the heat now locked into a liquid, subsequent stages of liquid (water) cooling can efficiently transport it from source to its final destination in the air outside the data centre, as shown below.

[Figure: heat flow along the “end to end liquid cooling” thermal path]

This approach entirely eliminates air from the heat transfer path between server components and the air outside the data centre. Water is already used for heat transport in many data centre facilities, although usually outside the main server room, as it is around 3500x better than air in this role. The Iceotope solution, as shown in the technology demonstration video above, couples building-level water circuits (which may be pre-existing or newly installed) directly to safe, efficient and low-pressure rack-level liquid cooling.
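
The "around 3500x" figure can be sanity-checked from the volumetric heat capacities of the two fluids. A back-of-the-envelope sketch using standard room-temperature textbook properties:

```python
# Volumetric heat capacity, J/(m^3*K) = density (kg/m^3) * specific heat (J/(kg*K)).
# Standard room-temperature textbook values for both fluids:
water = 998.0 * 4186.0  # ~4.18e6 J/(m^3*K)
air = 1.2 * 1005.0      # ~1.2e3  J/(m^3*K)

print(f"water moves ~{water / air:.0f}x more heat per unit volume than air")
# -> water moves ~3464x more heat per unit volume than air
```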

Because of the greater thermal efficiency of this “end to end liquid” cooling path, the building water circuit can be run much warmer, potentially eliminating the need for chiller plant and enabling year-round free cooling. With this approach, the three-year cooling cost of a 1 megawatt data centre could be reduced from around $788,400 to around $52,560: a 93% ($735,840) reduction compared to air cooling. By enabling servers to be packed more tightly without compromising cooling efficiency, the same approach could reduce the space required for the servers by 84%.
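
The article does not state the assumptions behind these dollar figures, but one set of inputs that reproduces them exactly is a 1 MW IT load running around the clock at an assumed $0.10/kWh, with cooling consuming 30% of the IT load under air cooling versus 2% under the liquid path. A sketch under those assumed inputs:

```python
# Assumed inputs - the article quotes only the resulting dollar figures.
IT_LOAD_KW = 1000.0    # 1 megawatt of IT equipment
PRICE_PER_KWH = 0.10   # assumed electricity price, $/kWh
HOURS = 24 * 365 * 3   # three years of continuous operation

def cooling_cost(cooling_overhead: float) -> float:
    """Cost of cooling energy, expressed as a fraction of the IT load's consumption."""
    return IT_LOAD_KW * cooling_overhead * HOURS * PRICE_PER_KWH

air = cooling_cost(0.30)     # $788,400 - conventional air cooling
liquid = cooling_cost(0.02)  # $52,560  - end-to-end liquid cooling
print(f"saving: ${air - liquid:,.0f} ({(air - liquid) / air:.0%})")
# -> saving: $735,840 (93%)
```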

Liquid cooling is not new; many manufacturers, including IBM and Cray Research, used it in the early days of computing. Today, liquid-cooled doors and cold plates are common on high-density computer systems. Iceotope have developed a complete solution that overcomes the 21st-century data centre problem: what do we do with the heat?

Now the problem will be – what do we do with the data centre? Game changing.

Iceotope

Here is the Iceotope launch message:

Iceotope specialises in liquid cooling for high efficiency, high density server computers.

Iceotope has developed the first truly scalable, modular liquid cooling solution for modern rack-based server architectures.

Using this technology, Iceotope and its licensed partners produce servers that are highly suited for deployment in enterprise data centres, high performance computing (HPC) facilities and cloud computing infrastructure.

Iceotope-enabled servers:

  • Reduce facility operating costs (i.e. reduced PUE) and associated carbon output.
  • Enable greater compute capacity within a given electricity budget.
  • Enable greater use of existing facility space through greater server/rack density.
  • Can be deployed in non-traditional locations, freed from the constraints of air cooling.
  • Deliver whole-life TCO benefits compared to air-cooled systems and alternative means of liquid cooling.
  • Can be deployed in retro-fit or new-build sites.

Iceotope will be announcing more details about the technology and demonstrating systems at Supercomputing 2009 on Booth 2355. Supercomputing is held between November 14th and 20th in Portland, Oregon.

  • Finlay J MacLeod

    Absolutely brilliant!

  • http://www.atov.com Martin Williams

    Wow! This is a genius yet simple idea. I thought this had been done long ago but now I see the difference – the components are all immersed in the liquid rather than using liquid to cool plates. Do you know if they have patents on their design?

  • http://thehotaisle.com Steve O’Donnell

    Martin,

    Iceotope have filed and had granted a number of patents both in the UK and US. The end-to-end design is brilliant and innovative. It makes the Data Centre obsolete – no need for all of the raised floors, fans, refrigeration and other cooling plant. Energy efficiency is stunning. It is totally silent – no fans screaming at high airflow rates.

    The design fits right between IBM’s innovative cold plate technology and the new liquid channel designs for high power chips. Both of these approaches are expensive to implement and manufacture. The Iceotope solution is simple and clean. Cheap to manufacture and totally reliable.

    Steve

  • http://www.iceotope.co.uk Keith Deakin

    Steve,

    Thanks for the great promo – we’re very excited and can’t wait to hear and build on the comments from both The Hot Aisle and SC 09.

  • http://www.eco-onnect.org Robert Hokin

    Very impressive and elegant solution as you describe it, Steve. As you say, potentially game changing. Investigating further, it appears that this is a very early-stage company (website under development). Too often we see solid UK engineering undermined by too little support from the public and private sector. How will they scale, and do they have the resources to meet the market?

    As the UK’s not-for-profit cleantech support resource, we’d be happy to offer assistance if required. Even to the Sheffield area!

  • Ron van de Water

    Steve, this sounds like a very good solution. Any idea of the price point of this type of server and the scalability of this type of solution?

  • http://thehotaisle.com Steve O’Donnell

    Ron,

    Thanks for your comment. I am told that the products are priced competitively and that the costs of encapsulation are sensible and comparable to a server chassis. The key thing is that these machines do not need refrigeration or CRAC units. Overall, the capital costs will be staggeringly lower.

    Equally, the operational costs will be significantly lower; even in a highly efficient data centre we will see electricity costs dropping 50 or 60%. In sites that have no cooling capacity left, these will be a godsend.

    Steve

  • http://thehotaisle.com Steve O’Donnell

    Robert,
    Readers at The Hot Aisle get the news first. I did say that Iceotope would be launching this week, and if you have a look, the website is up. Iceotope have been operating in stealth mode, developing the technology, delivering the beta systems and filing the patents.

    The firm is well funded and will now progress onto the next stage: launch and customer acquisition. Iceotope have had a huge response from the market, and it is clear that the time is right for liquid cooling. (We have been saying that here on The Hot Aisle for a long while; don’t forget we introduced data centre curtains.)

    Iceotope tell me that they are actively looking for customers now.

    Steve

  • George

    Any word on NEBS compliance?

  • oldie

    Definitely not new, and still an impractical idea due to component accessibility, which renders it not cost effective.

    For example, a liquid-cooled front door on the rack itself is a better way to move the cooling closer to the heat.

    Congrats on getting VC funding for this, but you ain’t going to sell any, and the idea is not patentable as people have been doing this with oil for decades. First match I came across: http://www.tomshardware.com/reviews/strip-fans,1203-11.html

  • Hendrik

    Nice approach, though the idea of cooling components by having them completely surrounded by liquid is quite old. I saw guys cooling their gaming PCs in an aquarium completely filled with some kind of oil more than five years ago. Only the drives were attached on the outside.

    Hendrik

  • http://thehotaisle.com Steve O’Donnell

    Hi Oldie

    Component accessibility is not an issue. No one EVER upgrades a server in an Enterprise Data Centre. No one ever attempts a fix in situ, not ever. In fact, having these systems encapsulated protects them from thermal and mechanical shock, making them more reliable.

    Cooled front doors are a good thing, but they tend to interfere with the CRAC units in the room and still depend on air to move the heat for the last few feet or inches.

    Encapsulating equipment in dielectric fluid is absolutely not new technology. Lots of folks do it because it is smart and enables lots of heat to be removed efficiently. Iceotope have patented many of the associated technologies that make a liquid-cooled blade system viable in a real data centre.

    The real intellectual property is the nature of the end-to-end cooling process that starts at the capsule and ends at the free cooler.

    Believe me, this is game-changing stuff.

    Steve

  • http://thehotaisle.com Steve O’Donnell

    Hi George

    NEBS is apparently on the radar along with the many other compliance issues the firm needs to sort out.

    Steve

  • Michael Betts

    Now if they can use the heat to make electricity, they would really make a dent in the e-bills.

  • Jacob

    You should consider making equipment for home use too. Being able to replace noisy fans with quiet pumps would be a great benefit in home environments, and unfortunately the convection-cooled PC is mostly just a dream.

  • Roland

    “The real intellectual property is the nature of the end-to-end cooling process that starts at the capsule and ends at the free cooler.”

    Don’t you mean “the plumbing”?

  • John Pombrio

    Heh. No component accessibility ever, eh? Just yesterday there were articles all over the web about Cray upgrading their AMD CPUs to produce the fastest supercomputer on the planet. Then there are the huge servers that Google uses, which have an enviable PUE of 1.19 and do not require UPSs. I would go that route any day of the week. This sounds like a workable solution, but I doubt you will get real-world results for infrastructure and maintenance costs that come within a factor of 2 of Google’s low-tech approach.

  • http://thehotaisle.com Steve O’Donnell

    John,

    I used to run BT’s IT estate with 70,000 servers. In my entire career I have NEVER upgraded a server. It doesn’t happen. Google does great with air-cooled solutions, and some refrigeration when they need it. They are also working out how to do follow-the-moon operations to keep their data centres cool. Unfortunately, in the real world that isn’t practical. Liquid cooling is inevitable; wake up and smell the coffee, pal.

    Steve

  • http://thehotaisle.com Steve O’Donnell

    Roland,

    Iceotope are a server packaging company, so yes, the intellectual property is indeed in the plumbing: extremely smart and joined-up plumbing for sure. Game-changing plumbing.

    Steve

  • http://thehotaisle.com Steve O’Donnell

    Jacob,

    I agree. Even my TV equipment makes too much noise because it generates heat and we need to dump it. Liquid cooling is silent, reliable and inevitable; watch the consumer space. We will see high-end kit running liquid first, and then it will become table stakes.

    Steve

  • http://thehotaisle.com Steve O’Donnell

    Michael,

    Interesting comment. High-quality heat is a genuinely important feature, and Iceotope can deliver water hot enough to shower in or to run under the floor as a heater. When we get to liquid channels embedded in the silicon, we might actually be able to look at leveraging this to drive the effective PUE below 1. Watch this space.

    Steve

  • http://thehotaisle.com Steve O’Donnell

    Hendrik,

    The IP is NOT in the immersion; it is in the smarts around integrating it into a data centre. We have seen lots of kitchen experiments and gamer hardware, but this is the first and best technology that just gets the end-to-end solution right: how do you make something that can be integrated into a data centre? I love it: fantastic, smart, well-thought-out technology.

    Steve

  • Ted Herring

    Looks like better execution of an existing concept; good for them. My concern would be redundancy: what if the primary water system or path fails? Does their design allow for 100% redundancy for high-availability needs?

    Given the high-density design and the heat generated, it would not take long for the motherboards, drives, etc. to overheat if the primary cooling fails.

  • http://thehotaisle.com Steve O’Donnell

    Hi Keith,

    You should be really proud of what your team have achieved here: smart, innovative technology that delivers a joined-up solution that matters.

    Well done.

    Steve

  • http://thehotaisle.com Steve O’Donnell

    Robert,

    The Hot Aisle was able to scoop the news and hit the press early with this amazing new technology. The website is up, Iceotope are well funded, and they are executing on a plan to change blade server packaging forever. The press launch has been amazing and the Internet is buzzing with these new ideas.

    I believe Iceotope will be in touch to get help.

    Steve

  • http://thehotaisle.com Steve O’Donnell

    Thanks Finlay,

    I know that you have run some of the biggest IT shops on the planet, and it is incredible that a UK company has been able to innovate and deliver this game-changing technology.

    Steve

  • http://thehotaisle.com Steve O’Donnell

    Hi Ted,

    You are obviously an operations guy: practical and to the point. The simple answer is yes, every critical component is backed up in an Nx2 approach, similar to fans and power supplies. SNMP traps are issued if components fail.

    The guys who pulled this together are Data Centre professionals and have the scars.

    Thanks

    Steve

  • kws

    “Component accessibility is not an issue. No one EVER upgrades a server in an Enterprise Data Centre. No one ever attempts a fix in situ, not ever.”

    I’ll tell that to a few sysadmins that I know. They will get a chuckle out of that.

  • bc

    So is this innovative liquid cooling solution a sign that cloud computing is becoming popular?

  • georges

    The next thing that should be done is to heat the building with that hot liquid!

  • http://www.renderplatform.net Duron

    Will the additional cost of the server and infrastructure ever weigh up against the cost of cooling?
    And I don’t want to be the one swapping memory modules in those machines. :)
    It looks like a perfect fit for closed-system supercomputers or one-time-no-upgrade installations where, if a node fails, the whole node can be replaced. A few weeks ago I read about a new UK supercomputer to calculate climate change, but it consumed a bazillion watts to operate. A perfect match for this!

  • http://thehotaisle.com Steve O’Donnell

    Duron,

    Thanks for the comment. As the Iceotope boxes are made from commodity parts, they are a similar price to other blade systems. When looked at end to end, the capital costs will be lower: there is no need for CRAC units or refrigeration plant at all.

    In an enterprise environment, folks don’t do memory upgrades. Replacement cycles are 18-36 months, as more advanced and efficient hardware is released. The key thing about the Iceotope solution is that blades can be swapped without changing out the rack.

    Steve

  • Pradeep

    Hi,
    Quite an interesting option. Are there any white papers around? I would like to delve more into the subject.

    It could be a good option as we move on towards greening Mother Earth.

    Rgds,

    Pradeep
