Data Center Cooling

It’s vital to keep your data center environment optimal to promote peak performance.

Data center cooling is a $20 billion industry. Cooling is the largest operational cost after the IT equipment (ITE) load itself, and it’s also the most important system to maintain.

A few data center cooling best practices can keep your data center humming along smoothly. These practices can help you improve the efficiency of your data center cooling system and reduce costs.

It’s important to execute any changes to your data center cooling system carefully. For this reason, it’s vital to work with an experienced engineer before making any changes in a live environment.

To learn more about data center cooling best practices, continue reading.

The State of Data Center Environmental Control

Today, data center environmental control is one of the most widely discussed topics in the IT space. There’s also a growing gap between older data centers and new hyperscale facilities. Regardless of age or scale, however, power utilization and efficiency are critical in any data center.

It’s well-known that data centers are among the largest consumers of electricity in the world. Today, data centers use an estimated 1% to 1.5% of the world’s electricity. What’s more, energy usage will only increase as more innovations emerge. These innovations include:

  • Artificial intelligence
  • Cloud services
  • Edge computing
  • IoT

Furthermore, these represent only a handful of emerging technologies.

Over time, the efficiency of technology improves. However, those gains are offset by the never-ending demand for increased computing and storage space. Firms need data centers to store information that enables them to satisfy consumer and business demands.

Accordingly, data center power density needs increase every year. Currently, the average rack power density is about 7 kW, and some racks run as high as 15 kW to 16 kW. High-performance computing, however, typically demands 40 kW to 50 kW per rack.

These numbers are driving data centers to source the most energy-efficient cooling systems available.

What Is the Recommended Temperature for a Data Center?

The American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) offers an answer to this question. ASHRAE suggests server inlet temperatures between 64.4°F and 80.6°F. Furthermore, the society recommends a relative humidity between 20% and 80%.

The Uptime Institute, however, has a different opinion. It recommends an upper temperature limit of 77°F.

However, many data centers run much cooler, especially older ones. IT workers prefer to err on the side of caution to avoid overheating equipment.
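
If you want to sanity-check readings against these guidelines programmatically, the short Python sketch below flags server inlet readings that fall outside the ranges cited above. The function name and sample values are illustrative and not part of either guideline.

    # Guidance cited above: ASHRAE inlet and humidity ranges, plus the Uptime Institute upper limit.
    ASHRAE_TEMP_F = (64.4, 80.6)   # recommended server inlet temperature range, degrees F
    ASHRAE_RH_PCT = (20.0, 80.0)   # recommended relative humidity range, percent
    UPTIME_UPPER_TEMP_F = 77.0     # Uptime Institute upper temperature limit, degrees F

    def check_inlet(temp_f, rh_pct):
        """Return a list of warnings for a single server inlet reading."""
        warnings = []
        if not ASHRAE_TEMP_F[0] <= temp_f <= ASHRAE_TEMP_F[1]:
            warnings.append(f"{temp_f:.1f} F is outside the ASHRAE inlet range")
        elif temp_f > UPTIME_UPPER_TEMP_F:
            warnings.append(f"{temp_f:.1f} F exceeds the Uptime Institute upper limit")
        if not ASHRAE_RH_PCT[0] <= rh_pct <= ASHRAE_RH_PCT[1]:
            warnings.append(f"{rh_pct:.0f}% RH is outside the ASHRAE humidity range")
        return warnings

    print(check_inlet(78.8, 45))   # flags the Uptime Institute limit, but not the ASHRAE range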

Data Center Cooling Calculations

It’s important to understand current conditions before making your data center cooling calculations. For example, you’ll need to assess the current IT load in kilowatts. You’ll also need to measure the intake temperature across your data center, including any hotspots.

At a minimum, you’ll want to record the temperature at mid-height at the end of each row of racks. You should also take the temperature at the top of the rack in the center of each row.

As you take measurements, record the location, temperature, date, and time. You’ll need this information later for comparison.
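
A spreadsheet is all you really need for this, but if you prefer to script your baseline survey, the hypothetical Python sketch below shows one way to log readings to a CSV file. The field names, file name, and values are illustrative only.

    from csv import DictWriter
    from dataclasses import dataclass, asdict
    from datetime import datetime

    @dataclass
    class InletReading:
        location: str      # e.g. "Row 1 end, mid-height"
        temp_f: float      # measured intake temperature, degrees F
        taken_at: str      # date and time of the reading

    # Illustrative baseline survey; locations and temperatures are made up.
    now = datetime.now().isoformat(timespec="minutes")
    readings = [
        InletReading("Row 1 end, mid-height", 71.2, now),
        InletReading("Row 1 center, top of rack", 74.8, now),
        InletReading("Row 2 end, mid-height", 72.5, now),
    ]

    # Write the baseline to a CSV file for later comparison.
    with open("baseline_intake_temps.csv", "w", newline="") as f:
        writer = DictWriter(f, fieldnames=["location", "temp_f", "taken_at"])
        writer.writeheader()
        writer.writerows(asdict(r) for r in readings)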

Now, measure the power draw of your cooling unit in kilowatts. Typically, you’ll find a dedicated panel for this measurement on most units. You could also use a separate monitoring system to take this measurement.

You’ll also need to measure the room’s sensible cooling load. You’ll need to measure the airflow volume for each cooling unit for this task. Also, you’ll need to record the supply and return temperatures for each active unit.

Getting to the Math

You can determine a reasonable sensible capacity, in kilowatts, for each operating unit using the following formula:

Q sensible (kW) = 0.316 × CFM × (Return Temperature [°F] − Supply Temperature [°F]) / 1000
[Metric: Q sensible (kW) = 1.21 × CMH × (Return Temperature [°C] − Supply Temperature [°C]) / 3600]
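
If you’d rather script the calculation, here is a minimal Python sketch of the same formula. The function names and example numbers are illustrative only.

    def sensible_kw_imperial(cfm, return_temp_f, supply_temp_f):
        """Sensible cooling output of one unit, from airflow in CFM and temperatures in degrees F."""
        return 0.316 * cfm * (return_temp_f - supply_temp_f) / 1000

    def sensible_kw_metric(cmh, return_temp_c, supply_temp_c):
        """The metric form, from airflow in cubic meters per hour and temperatures in degrees C."""
        return 1.21 * cmh * (return_temp_c - supply_temp_c) / 3600

    # Illustrative example: one unit moving 12,000 CFM with a 10 F return-to-supply split.
    print(round(sensible_kw_imperial(12_000, 75.0, 65.0), 1))   # about 37.9 kW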

Now, you can compare the cooling load to the IT load to create a point of reference.

Next, you’ll make use of the airflow and return air temperature measurements. You’ll need to contact your equipment vendor for the sensible capacity of each unit, in kilowatts, at those measured conditions. Then, total the sensible capacity of the units currently in operation. This is about the simplest calculation you’ll find; if you prefer, you can find more detailed methods online.

Next, take the room’s total operating sensible cooling capacity and the measured IT load in kilowatts. Divide the former by the latter to find the sensible operating cooling ratio. Now you have a benchmark against which to evaluate subsequent improvements.
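
As a rough illustration, the Python sketch below totals the vendor-supplied sensible capacities of the operating units and divides by the measured IT load. The function name and numbers are hypothetical.

    def cooling_to_it_load_ratio(unit_sensible_capacities_kw, it_load_kw):
        """Total operating sensible cooling capacity divided by the measured IT load."""
        return sum(unit_sensible_capacities_kw) / it_load_kw

    # Illustrative values: three operating units, each rated at about 38 kW of sensible
    # capacity at the measured airflow and return temperature, against a 70 kW IT load.
    ratio = cooling_to_it_load_ratio([38.0, 38.0, 38.0], 70.0)
    print(round(ratio, 2))   # 1.63 -- the benchmark to compare against after any changes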

Still, it’s important to consult with IT engineers. They can help you determine the maximum allowable intake temperature that won’t damage your IT equipment in the new environment. Using your collected data, you can create a work plan to establish your goals. You can also use this information to determine the metrics you’ll monitor to ensure the cooling environment functions properly.

You’ll also want to develop a back-out plan in case you run into problems along the way. Finally, pinpoint the performance metrics that you’ll track, such as inlet temperatures, power consumption, or other indicators.

Data Center Cooling Best Practices

It can prove challenging to figure out where to start with upgrades to data center environmental control, and a few data center cooling best practices can help in this regard. Many variables affect the airflow in your data center, from the types of racks to the cable openings. By following airflow management best practices, however, you can avoid equipment failures. The following strategies can help improve your data center airflow management for better efficiency:

  • Manage the cooling infrastructure
  • Block open spaces to prevent air bypass
  • Manage data center raised floors

What follows are details for these strategies.

Best Practice 1: Manage the Cooling Infrastructure

Data centers use a lot of electricity. For this reason, they need a robust cooling infrastructure to keep everything working correctly. To put this in perspective, according to the US Department of Commerce, the power densities of these facilities, measured in kilowatts (kW) per square foot (ft²) of building space, can be nearly 40 times higher than those of commercial office buildings.

If you need to improve the airflow in your data center, you may want to consider changing the cooling infrastructure. For example, you might reduce the number of operating cooling units to match the needed capacity. Alternatively, you might raise the supply temperature without exceeding your maximum server intake air temperature.

Best Practice 2: Block Open Spaces

It’s vital to close all open spaces under your racks. It’s also important to close open spaces in the vertical planes of your IT equipment intakes.

You must also close any open spaces in your server racks and rows. Gaps here can skew your airflow balance.

Also, you’ll want to seal any spaces underneath and on the sides of cabinets as well as between mounting rails. You’ll also want to install rack grommets and blanking panels. In this way, you’ll ensure that there aren’t any unwanted gaps between your cabinets.

Best Practice 3: Manage Data Center Raised Floors

Also, you’ll want to monitor the open area of the horizontal plane of your raised floor. Openings in the raised floor allow air to bypass the IT equipment, which can also skew the airflow balance in your data center.

You’ll want to manage the perforated tile placement on your raised floor to avoid this problem. You must also seal cable openings with brushes and grommets. Finally, you’ll need to inspect the perimeter walls underneath the raised floor for partition penetrations or gaps.

Choosing a Data Center Cooling Design

There are a few emerging data center cooling methods in the computer room air conditioning (CRAC) space, such as data center water cooling. For example, you might want to consider advanced climate controls to manage airflow.

State-of-the-art data centers incorporate new ways to optimize the cooling infrastructure for greater efficiency, and several of these technologies now make precision data center environmental control possible.

Usually, the data center cooling methods that you choose are driven by site conditions. An experienced consultant can help you to select the right data center cooling design.

Your Partner in Data Center Air Control

Now you know more about data center cooling best practices. What you need now is a well-qualified expert in data center cooling. Data Aire has more than 50 years of experience helping firms find innovative answers to emerging demands.

At Data Aire, we’re a solutions-driven organization with a passion for creativity. Furthermore, we believe in working closely with our clients during the consultative process. We can give you access to extensive expertise and control logic. By partnering with us, you’ll enjoy world-class manufacturing capability recognized by leading international quality certifications.

Contact Data Aire today at (800) 347-2473 or connect with us online to learn more about our consultative approach to helping you choose the most appropriate environmental control system for your data center.