Data Center HVAC

Six Golden Safety Rules for Data Centers by Data Center Knowledge


Data Center Knowledge and Digital Realty Trust's Director of Environmental, Occupational Health and Safety wrote an article about the cultural shift in the data center industry. As data centers have grown and become more complex, the risks have grown as well.


Here's an excerpt:

There’s a culture shift afoot in the data center industry. It places new emphasis on safer working conditions for data center engineers and electrical contractors where, historically, such emphasis was an afterthought. As equipment has grown increasingly complex and sophisticated, so too have the risks. This new safety awareness is a welcome change.

Today’s data centers -- from the hyper-scale to the agile colocation centers -- have increased in size, complexity and importance. They use massive amounts of power and cooling to ensure reliable operations. Data center companies have come to recognize that safety excellence is imperative to maximize uptime for their customers while reducing operational risk. Of the many safety issues that today’s data center operators must consider, these six rules are at the top of the list.

Data Center HVAC and the Race to Improve PUE


Guest Blog by Scott Wilson, HVACSchool.org

Every industry has different ventilation and cooling needs that require some level of customization when designing and building systems to meet those demands. You could even say there are as many specialties in the HVAC trade as there are unique ways to tailor systems to different industrial applications. But few have emerged as quickly or have as big a role in the future of the trade as the specialized systems used to keep data centers cool.

Data centers are the engines that run the internet and make distributed computing possible, offering efficient performance at a low price. With software increasingly offered as a service (like Microsoft’s Office 365 product), and with the likes of Facebook and Google keeping all their horsepower in remote facilities, thousands upon thousands of servers housed in data centers around the world are relied upon to run continuously, day and night.

The internet never sleeps and data centers must never go down. Billions of dollars of commerce and information flow through those wires, so data center HVAC work is also mission critical. This has meant incorporating redundancies to keep temperature, humidity, and particulate count within acceptable parameters even if some units go down or if electricity fails.

By all accounts, the reliance on remote data centers will only increase for the foreseeable future. In fact, according to Data Center Dynamics, construction of new data centers will increase at a rate of about nine percent per year through 2019.

Data centers are all about power, and as the laws of thermodynamics dictate, that power turns into heat at some point. Without massive cooling arrangements, all those vital servers would eventually turn into molten metal.

Data Center HVAC Has Come a Long Way

Power Usage Effectiveness (PUE) is the metric used to measure the energy efficiency of data centers. To find the PUE, you simply take the total energy required to run a data center (including everything from lighting to HVAC) and divide it by the energy used just to run the servers. In 2007, the average PUE for all U.S. data centers was 2.0, meaning that for every 1 watt used for computing power, another full watt of overhead energy was consumed.

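For readers who like to see the arithmetic spelled out, here is a minimal sketch of the PUE calculation. The kilowatt figures are hypothetical, chosen to match that 2007 average:

```python
# A minimal sketch of the PUE calculation described above.
# The kW figures are hypothetical, for illustration only.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power."""
    return total_facility_kw / it_equipment_kw

# Example: a facility drawing 2,000 kW overall, 1,000 kW of it for the servers.
ratio = pue(total_facility_kw=2000, it_equipment_kw=1000)
print(f"PUE = {ratio:.2f}")                         # PUE = 2.00
print(f"Overhead per IT watt = {ratio - 1:.2f} W")  # 1.00 W for cooling, lighting, etc.
```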

The initial approach to data center cooling was just to drop in big coolers and fans as a way to muscle through the heat. But since even a small reduction in that PUE number can mean millions of dollars in savings on energy consumption, there was a major incentive to develop more efficient and specialized systems.

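How much can a few tenths of a point be worth? Here is a rough, back-of-the-envelope sketch, assuming a 10 MW IT load and a $0.07/kWh electricity rate (both assumptions for illustration, not figures from the article):

```python
# Rough annual cost comparison for two PUE values.
# The IT load, electricity rate, and PUE values are assumptions.

HOURS_PER_YEAR = 8760

def annual_energy_cost(it_load_kw: float, pue: float, usd_per_kwh: float) -> float:
    """Total facility energy cost for one year at a given PUE."""
    return it_load_kw * pue * HOURS_PER_YEAR * usd_per_kwh

it_load_kw = 10_000   # 10 MW of IT load (assumed)
rate = 0.07           # $/kWh (assumed)

before = annual_energy_cost(it_load_kw, pue=2.0, usd_per_kwh=rate)
after = annual_energy_cost(it_load_kw, pue=1.7, usd_per_kwh=rate)
print(f"Annual savings: ${before - after:,.0f}")  # about $1.8 million per year
```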

Designs evolved to make use of hot and cool aisles, with servers venting into a hot aisle and large overhead plenums pulling the hot air into chillers to bring the temperature down. Cold air would then be pushed out again beneath raised floors and vented into the cool aisles, where the server intakes would draw it in once more.


Although this technique provided real advantages over brute-force, whole-room cooling, many HVAC contractors are taking it further still. Now, some centers are expressly located in cold climates to make maximum use of external air temperatures and reduce the reliance on chillers. Others use passive circulation techniques to reduce the need for powerful circulating fans.


Advances in HVAC design strategies and technology drove the average PUE down to about 1.7 by 2014. But some data center operators have gone even further, with Google hitting an average PUE of 1.12 across all its data centers as of early 2017.

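To put those averages in perspective, the overhead per watt of computing falls off sharply as PUE approaches 1. A quick sketch, restating the figures above:

```python
# Overhead power per watt of IT load at the PUE values cited above.
# PUE minus 1 is the overhead (cooling, lighting, etc.) per IT watt.
for label, pue in [("2007 average", 2.0), ("2014 average", 1.7), ("Google, early 2017", 1.12)]:
    print(f"{label}: {pue - 1:.2f} W of overhead per watt of computing")
```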

A low PUE is a competitive advantage, and HVAC contractors who can install systems that drive that ratio down have a serious edge over the competition. The latest methods for achieving new lows in PUE may not even be publicized yet.


As a relatively new practice area, data center HVAC work is also evolving at a much more rapid pace than other areas of the trade. According to Google’s VP of Data Center Operations, the internet giant changed its data center cooling strategy five times in the span of almost seven years.


The tolerances and requirements for hitting low PUE ratios while maintaining a high degree of reliability aren’t something you just pick up on the job. The science behind air circulation and the refrigeration cycle is something you need to understand on day one, making formal education and training more important than ever for anyone preparing to enter the trade.


Scott is an IT consultant based in the North West and the lead contributor for HVACSchool.org. A dedicated resource for people exploring HVAC education and training options, HVACSchool.org works hard to keep up with the latest developments in the industry and keep students and tradespeople ahead of the curve.


Data Center HVAC Lift Case Study

Contributor:  Mike Maloni, Service Project Executive at F.E. Moran Mechanical Services
Writer:  Sarah Block, Marketing Director at The Moran Group


When a major Chicago data center needed an HVAC change, they contacted F.E. Moran Mechanical Services and Dearborn Engineering to get them out of a tight situation.

Data Center Serving 9th Largest Financial District in the World

Serving one of the largest financial districts in the world, this Chicago data center is an integral part of the city's business. With more than 183,000 square feet of secure and reliable data center space, it keeps businesses running with high-density power configurations. When F.E. Moran Mechanical Services was approached to provide HVAC service for the data center, the team was eager to begin and face the upcoming challenges head-on.

Two Inches to Spare

This project included adding chillers and evaporative condensers, with individual pieces weighing up to 20,000 pounds. The equipment needed to be lifted onto the roof of a seven-story building on LaSalle Street, a bustling, compact area.

F.E. Moran Mechanical Services worked with Dearborn Engineering to safely coordinate and complete the lift. They rented the largest truck crane available, a 600-ton unit. The two teams had to coordinate and maneuver this large piece of equipment between CTA stops and buildings, working in a very tight spot. With such a huge crane in such a confined location, great care had to be taken with every move.

Another issue soon showed itself: the deteriorating condition of Chicago's sidewalks, along with the data tunnels running beneath them. The 600-ton crane could cause the sidewalk to crumble beneath it, so a solution had to be found.

Carefully Choreographed Dance

On Friday, February 20, 2015, F.E. Moran Mechanical Services and Dearborn Engineering set up the lift site. It was no easy task. They calculated the swing radius, the distance from the counterweights to the nearby building, the pick points, and the roof edges of the buildings. In the end, they had mere inches between the nearby building and the crane. There was no room for mistakes.

To resolve the crumbling sidewalk issue, Dearborn Engineering cut the sidewalk, tied shoring onto the foundation of the building, and placed two outriggers for support.

On Saturday, the lift was completed: over 100,000 pounds of equipment (the heaviest piece weighing in at 20,000 pounds) was hauled onto the roof of the data center and brought into the new penthouse. It took the entire next day to remove the lift set-up.

In the end, the coordination, meetings, and detailed planning allowed the lift to go off without a hitch. By the end of the day on Saturday, all HVAC equipment, structural steel, panels, and electrical equipment was safely inside the building. Service Project Executive Mike Maloni said, "This project was both fun and challenging. With careful planning and coordination with the City of Chicago, Coresite, HITT Contracting, Stephenson Crane & Dearborn Engineering, the crane lift went seamlessly. This difficult task allowed us to really exercise our skills. We're very proud of the results."
