Submerged Servers?
Ever have one of those moments when you connect the dots between a couple of ideas and think, 'there has to be a better way to do that'? As the program manager and board member of the ACP (Association of Contingency Planners, South Texas chapter - not really important to the story but a plug nonetheless), I invited two vendors to talk about fire suppression in data centers at one of our monthly chapter meetings.
I had asked them to do part presentation and part demonstration. The presentation was standard, informative stuff - sealing vents, suppressant dispersal, chemical types, etc. The demo, however, was like a magic trick. One of the guys showed us a liquid chemical (3M Novec) that suppressed fire by cooling the surrounding area. My perception was that fire could only be suppressed by oxygen deprivation. The concept is that fire needs heat, fuel, and oxygen, so if you reduce the heat instead of eliminating the oxygen, the fire still goes out. The chemical evaporated and cooled everything around it. When a lit match was moved over the top of the chemical bottle, the flame whimpered and went out; the same happened with a candle placed next to the opened bottle. I learned something new.
Next, one of the vendors poured the chemical into a bowl, pulled out his BlackBerry cell phone (video here), put it on speaker mode, and dropped the phone into the bowl of liquid Novec. Surprisingly, the chemical was non-conductive - meaning it did not short-circuit the phone the way water would (water conducts electricity; when you wash your phone in your jeans pocket or get pushed into the pool, the phone is damaged, usually beyond repair). The phone worked great while submerged in the chemical, and it worked fine after he pulled it out and held it in the air. You could hear the conversation over the speaker. The chemical was safe to the touch and did not require any respirators (not harmful to breathe).
Neat idea - a liquid that is non-conductive, safe, and uses cooling as its suppression method.
I have visited several data centers throughout the years, and in general they are designed basically the same way - maximize floor space while balancing the cost of environmentals. (Now, before every data center designer corrects me, let me acknowledge that many data center designs vary through specialization and competitiveness, but the general problem is floor space and environmentals.) Data centers are large facilities that can implement large-scale, redundant solutions for hundreds or thousands of server racks. The goal is to leverage those expensive solutions across all racks of servers - spreading the cost to everyone. More participants equates to lower cost per user. Most companies do not have the capital to invest in these large-scale efficiencies alone, so they lease floor space inside a data center to offset the cost. This is one of the financial appeals of data centers.
The efficiencies and redundancies center on power, cooling, and networking. Power and networking are easier to design because of their predictable behavior, standards, and physics. Cooling is the tricky part. Components in racks (like servers) can operate at different specifications (think 16 racked servers vs. 16 bladed servers - different power consumption rates, different cooling requirements). Compound the servers with storage, hard drives, networking switches, WAN devices, etc., and you have a large set of variations. You can 'worst-case' the data center design by targeting a certain temperature and moving hot air out universally. If you want to be more efficient, you need to monitor your entire data center with thermal imaging. The imaging shows, in general terms, which parts of which server racks need more or less cooling and/or air movement. This type of imaging would allow you to fine-tune the environment, keep it in spec, and (hopefully) allow for less than worst-case cooling. The savings from fine-tuned thermal monitoring are worthwhile, but not dramatic.
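To make the 'fine tuning' idea concrete, here is a minimal sketch (in Python, with made-up rack names, readings, and thresholds - not a real data center API) of how per-rack temperature readings could be turned into coarse cooling decisions:

```python
# Hypothetical sketch: classify racks by their temperature readings and flag
# which ones need more airflow and which are being over-cooled.
# All rack names, readings, and thresholds below are illustrative assumptions.

RACK_SENSORS = {
    "rack-01": [22.5, 23.1, 24.0],   # simulated inlet temps (deg C), bottom to top
    "rack-02": [27.8, 29.4, 31.2],   # a hot spot
    "rack-03": [18.0, 18.5, 19.1],   # likely over-cooled
}

TARGET_MIN_C = 20.0   # assumed lower bound of the desired operating band
TARGET_MAX_C = 27.0   # assumed upper bound

def classify(readings):
    """Return a coarse action for a rack based on its extremes."""
    if max(readings) > TARGET_MAX_C:
        return "increase airflow / cooling"
    if min(readings) < TARGET_MIN_C:
        return "reduce cooling (wasted capacity)"
    return "in spec"

for rack, temps in RACK_SENSORS.items():
    print(f"{rack}: max {max(temps):.1f} C -> {classify(temps)}")
```

In a real facility the readings would come from thermal imaging or rack sensors and feed a building management system rather than a print statement, but the decision logic has the same shape.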
Many data centers have large-scale UPS (uninterruptible power supplies) and diesel generators to supply power when the outside utility company cannot. Generally, this occurs during weather events or other calamities beyond the utility company's control. Data center cooling is supplemented with large external tanks holding thousands of gallons of chilled water.
If cooling is one of the largest expenses in a data center, a liquid-cooled data center has to be on its way. The big idea would be to take standard, unmodified, air-cooled servers and submerge them (with power running, maybe with the fans removed) into a pool of non-conductive, non-corrosive cooling liquid. Like the cell phone in the fire suppression demonstration, the server runs unaffected by the liquid. Reducing data center cooling costs would be the catalyst for this technology. The company that can develop the chemical soup to support existing and future servers will be the hero of data centers and likely Wall Street. Server companies will only need to add one new specification to their servers - liquid displacement. According to a GigaOm report, startups have emerged claiming their liquid cooling technology can cut data center cooling costs by over 90%. Another startup, Green Revolution Cooling, claims its solutions can cut data center cooling costs by almost half. Those would be dramatic savings. The future will see a new data center design where our off-the-shelf servers can be air or liquid cooled.
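To put those percentages in perspective, here is some back-of-the-envelope arithmetic (every number is an illustrative assumption, not measured data): if cooling accounts for roughly 35% of a facility's total power draw, a 90% cut in cooling energy trims the total by roughly 30%, and a 50% cut by nearly 18%.

```python
# Back-of-the-envelope sketch of what the claimed cooling reductions could mean
# for a facility's total power draw. All figures are illustrative assumptions.

total_facility_kw = 1000.0        # hypothetical total facility load
cooling_fraction = 0.35           # assume ~35% of facility power goes to cooling

cooling_kw = total_facility_kw * cooling_fraction

for claimed_cut in (0.90, 0.50):  # the ~90% and ~50% reductions cited above
    saved_kw = cooling_kw * claimed_cut
    print(f"Cut cooling by {claimed_cut:.0%}: save {saved_kw:.0f} kW, "
          f"about {saved_kw / total_facility_kw:.0%} of the total facility load")
```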
Picture of an Intel-Green Revolution Cooling experiment running in liquid
Fire suppression chemicals that cool + data centers = submerged servers
Infrastructure Engineer at United Airlines