
Data Center Energy Efficiency Best Practices Guide (Improve your PUE)

This Data Center Energy Efficiency Best Practices Guide was made to help data centers curb their primary operational cost: energy consumption.

As data consumption continues its exponential growth, data centers will need to maximize their energy efficiency to avoid being crippled by power costs.

This article will cover the best practices that data centers can follow to improve their energy efficiency.

First and Foremost: Measure to Improve

As Peter Drucker is so often quoted as saying:

If you can’t measure it, you can’t improve it.

Before you consider any of the other strategies in this article, it’s important that you have a measuring system in place already.

Google is so intent on measuring accurately that they sample their power usage effectiveness (PUE) at least once per second.

By measuring across the year, you account for seasonal weather variations and their effects on the data center’s cooling consumption.

The Center of Expertise for Energy Efficiency in Data Centers has a helpful guide on measuring data center energy efficiency.
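To make the metric concrete, here's a minimal sketch of how PUE could be computed from periodic power samples; the field names, sample values, and one-second interval are illustrative assumptions rather than a reference implementation.

```python
# Minimal sketch: computing PUE from periodic power samples.
# PUE = total facility energy / IT equipment energy, so values approach 1.0
# as overhead (cooling, power distribution, lighting) shrinks.
# The sample data and one-second interval below are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class PowerSample:
    total_facility_kw: float  # everything the facility draws, including cooling
    it_equipment_kw: float    # servers, storage, and network gear only

def compute_pue(samples: list[PowerSample], interval_s: float = 1.0) -> float:
    """Integrate power over time (kWh) and return total energy / IT energy."""
    total_kwh = sum(s.total_facility_kw for s in samples) * interval_s / 3600
    it_kwh = sum(s.it_equipment_kw for s in samples) * interval_s / 3600
    return total_kwh / it_kwh

# Hypothetical samples: 900 kW facility draw against 600 kW of IT load -> PUE = 1.5
samples = [PowerSample(900.0, 600.0)] * 3600  # one hour of one-second samples
print(f"PUE: {compute_pue(samples):.2f}")
```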

Cooling


Cooling equipment is responsible for a large portion of a data center’s power consumption. For data center infrastructure specialists, finding new ways to improve power usage effectiveness (PUE) is critical, and therefore cooling is of primary interest.

Of the cooling equipment, chillers and computer room air conditioners (CRACs) use the most energy, so minimizing their workload is critical for efficiency.

Standard Energy Efficiency Best Practices for Easy Wins

Practices such as hot aisle/cold aisle containment are widely adopted as standard, yet many data centers haven't updated their infrastructure recently or captured these easy efficiency wins. A diagram from submer.com illustrates the cold aisle/hot aisle concept.

Aside from optimizing existing cooling paradigms, there are several other methods of cooling which can provide a significant efficiency boost beyond the standard practices.

One of these methods is direct evaporative cooling.

Direct Evaporative Cooling

Direct evaporative cooling (DEC) uses misting to provide the moisture for evaporation. If you’ve ever seen fans blowing mist in Las Vegas to cool off poolside vacationers, you’ve seen this mechanism in action.

A diagram from dchuddle.com illustrates how direct evaporative cooling works.

According to Anne Wood, an executive at Phoenix MFG, Inc., it is not uncommon to realize a 50% increase in efficiency from DEC. However, there are a few considerations:

Obviously, to use water for cooling, the data center will need access to a reasonable volume of water.

Direct evaporative cooling also requires a system to purify the water, store backup water, and pump the mist, as well as regulators to control water flow and pressure.

Additionally, direct evaporative cooling introduces humidity into the data center, which makes it more viable in certain use cases than others.

In particular, data centers in drier climes with access to water would likely benefit from a DEC solution.

Before committing to a DEC system, use humidity sensors and predictive modeling to determine whether it would push ambient room humidity past the standard 60% limit.
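To make that check concrete, here's a rough sketch of how sensor readings might be screened against the 60% ceiling; the readings and the simple additive humidity estimate are assumptions for illustration, not a psychrometric model.

```python
# Rough sketch: flagging when direct evaporative cooling (DEC) is likely to push
# room humidity past a 60% relative-humidity ceiling. The readings and the
# added-humidity estimate below are illustrative assumptions, not a full
# psychrometric calculation.

RH_LIMIT_PERCENT = 60.0

def dec_is_safe(current_rh_percent: float, projected_rh_gain_percent: float) -> bool:
    """Return True if the projected post-DEC humidity stays under the limit."""
    return (current_rh_percent + projected_rh_gain_percent) < RH_LIMIT_PERCENT

# Hypothetical sensor readings from a dry-climate facility.
readings = [32.5, 35.0, 41.2, 47.8]   # ambient relative humidity, %
estimated_gain_from_dec = 15.0        # assumed humidity added by misting, %

for rh in readings:
    status = "OK" if dec_is_safe(rh, estimated_gain_from_dec) else "over limit"
    print(f"Ambient {rh:.1f}% RH -> {status}")
```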

Indirect evaporative cooling is a somewhat less common, but equally viable solution using the same concept.

Indirect Evaporative Cooling

Indirect evaporative cooling takes warm air from outside the data center and passes it through a heat exchanger. The heat exchanger facilitates evaporation, cooling the air that is sent into the data center.

At the same time, humidity and heat are exhausted back out of the exchanger. The downside is that a heat exchanger loses a few degrees of cooling compared to direct evaporative cooling.

Indirect evaporative cooling also requires two fans instead of the single fan used in direct evaporative cooling.

The benefit of indirect evaporative cooling is that it does not introduce any humidity or outside elements into the data center environment, which may or may not be a concern.

To summarize: direct evaporative cooling offers the larger efficiency gain but consumes water and adds humidity to the room, while indirect evaporative cooling keeps outside air and moisture out of the data center at the cost of a few degrees of cooling and an extra fan.

The most innovative of recent cooling strategies, however, is immersion cooling.

Immersion Cooling

Alibaba has committed to using immersion cooling in its data centers and estimates it will reduce its space requirements by 75%, increase power density, and cut operational costs by 20%.

Immersion cooling will very soon be a standard entry on any list of data center energy efficiency best practices.

Images of Alibaba’s immersion cooling tech can be found at datacenterdynamics.com.

Despite showing such significant promise to reduce data center PUE, the idea of liquid immersion is uncomfortable for many IT companies.

As technology improves and more early adopters come forward, it’s become apparent that immersion cooling is not only viable but imperative if data centers are going to keep up with rising efficiency demands.

In high-density data center deployments, immersion cooling will likely be essential, but existing data centers are slower to adopt the technology.

Some of the considerations with immersion cooling are expense, mess, infrastructure retrofitting, hard drive compatibility, vendor compatibility, weight, safety, floor space, and resource consumption.

While these are valid concerns for extensive retrofit projects, they no longer present drawbacks sufficient to prevent widespread adoption in future deployments.

We’ll take a high-level look at each one.

Expense

Historically, immersion cooling came with cost premiums that offset many of its energy efficiency advantages, especially for retrofit use cases, largely due to the specialized tanks, dielectric fluid, and facility changes involved.

Mess

Oil and other dielectric fluids can increase maintenance labor requirements, though this varies depending on the solution provider used.

Infrastructure Retrofitting

In the past, more universally viable immersion solutions were not available. This necessitated a total data center retrofit or from-scratch build project to accommodate the liquid cooling technologies.

While it may still be impractical to gut a legacy data center for new cooling tech, immersion cooling is the more efficient choice for most new data center builds.

Hard Drive Compatibility

Standard hard drives cannot be submerged in liquid cooling systems.

However, sealed spinning-disk drives are an option, and solid-state drives, which tolerate immersion, make up a growing share of storage in modern data centers.

Additionally, modified drive caddies can be used to keep drives above the oil’s surface.

Vendor Compatibility

In the past, vendors would void warranties for submerged hardware, but as immersion cooling has become more established in data center infrastructure, vendors no longer void warranties over it.

Additionally, many immersion cooling solutions, like GRC’s, are compatible with every major server vendor and most rack setups.

Weight

Weight has historically been a challenge in liquid immersion cooling, and more specifically for rack-mounted solutions.

The weight can be a valid concern, but more as a result of system density than the weight of the fluid.

Additionally, air cooling infrastructure is no longer necessary with immersion cooling, which significantly reduces total weight by eliminating heavy CRACs, chillers, and related equipment.

With the floor loading capabilities of modern data centers and space savings of immersion cooling, weight shouldn’t be a preventative concern.

Safety

Most immersion cooling uses non-flammable, non-toxic fluids. Slips from fluid spills are a valid concern.

Floor Space

One of the bigger value adds from immersion cooling is the floor space savings it provides.

Immersion cooling is space-efficient and eliminates the need for all of the infrastructure required for air cooling.

Resource Consumption

Immersion cooling consumes radically less energy and water than other cooling options. As data centers’ contribution to world resource consumption increases, it will be ever more critical to improve resource consumption efficiency in our data center infrastructure systems.

Recap: Cooling Systems

Based on all of the relevant considerations, many corporations are realizing the potential superiority of immersion cooling, with one going as far as building an entire data center underwater.

While it may not be viable for your data center, give serious consideration to the potential of immersion cooling for your hardware.

Power

Distribution

It’s estimated that a third of server energy is wasted before it gets used for computation.

Much of it is lost at the power supply, where AC is converted to DC, and the voltage regulator, where the PSU’s output is converted to the voltages that microchips use. As a result, investing in efficient power supplies and voltage regulators is key.
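To see why those conversion stages matter, here's a back-of-the-envelope sketch showing how per-stage efficiencies compound; the efficiency figures are assumptions, not measured values.

```python
# Back-of-the-envelope sketch: losses compound across each conversion stage.
# The per-stage efficiency figures below are illustrative assumptions.

def delivered_fraction(stage_efficiencies: list[float]) -> float:
    """Multiply per-stage efficiencies to get the fraction of power delivered."""
    fraction = 1.0
    for eff in stage_efficiencies:
        fraction *= eff
    return fraction

legacy = [0.90, 0.85]    # assumed: older PSU (AC->DC), older voltage regulator
upgraded = [0.96, 0.93]  # assumed: high-efficiency PSU and voltage regulator

for label, stages in (("legacy", legacy), ("upgraded", upgraded)):
    f = delivered_fraction(stages)
    print(f"{label}: {f:.1%} delivered, {1 - f:.1%} lost before computation")
```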

One small change is to place the backup batteries on the server racks themselves and cut out one AC to DC conversion stage.

Another best practice is to keep distribution at higher voltages for as much of the path as possible and step down to lower voltages as close to the load as possible; low-voltage lines carry more current for the same power and therefore lose more energy to resistance.
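A quick resistive-loss comparison illustrates the point; the power, voltage, and cable resistance figures below are hypothetical.

```python
# Quick sketch: resistive line loss (I^2 * R) for the same power delivered at
# different voltages. Lower voltage means higher current, so losses grow fast;
# hence keep low-voltage runs as short as possible. Numbers are hypothetical.

def line_loss_watts(power_w: float, voltage_v: float, resistance_ohm: float) -> float:
    current_a = power_w / voltage_v
    return current_a ** 2 * resistance_ohm

POWER_W = 500.0          # assumed load served by the run
RESISTANCE_OHM = 0.01    # assumed resistance of the cable run

for voltage in (48.0, 12.0):
    loss = line_loss_watts(POWER_W, voltage, RESISTANCE_OHM)
    print(f"{voltage:>4.0f} V run: {loss:.2f} W lost ({loss / POWER_W:.1%})")
```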

Batteries: Li-ion batteries, VRLA, or Nickel-Zinc?

Though likely not the first thing you think about when data center energy comes up, batteries are an integral part of power within data centers. In this section, we’ll go over three of the primary battery options.

Valve-regulated lead-acid (VRLA) batteries have been the standard for years, but they do have drawbacks compared to other options: most notably, VRLA is less energy efficient.

Li-ion batteries, on the other hand, offer higher energy density, a smaller footprint, and a longer service life, though at a higher upfront cost.

Nickel-Zinc batteries (from ZincFive):

While Nickel-Zinc technology itself is nothing new, it has not previously been applied to data centers in this way. It may prove a safer, more environmentally friendly option for data centers.

Virtualization

With server virtualization, a data center doesn’t need as many servers to handle its workload.

With fewer servers, the total energy consumption can be greatly reduced. While beyond the scope of this article, Gartner recently put out an update of the virtualization landscape.
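As a rough illustration of why consolidation pays off, here's a sketch estimating the annual energy saved by moving lightly loaded physical servers onto fewer virtualization hosts; every figure is an assumption chosen for illustration.

```python
# Rough sketch: estimating annual energy savings from consolidating lightly
# loaded physical servers onto fewer virtualization hosts. All figures below
# are illustrative assumptions.

HOURS_PER_YEAR = 8760

def annual_kwh(server_count: int, avg_draw_w: float) -> float:
    """Annual energy for a fleet drawing avg_draw_w watts per server."""
    return server_count * avg_draw_w * HOURS_PER_YEAR / 1000

before_kwh = annual_kwh(server_count=100, avg_draw_w=250)  # 100 idle-heavy servers
after_kwh = annual_kwh(server_count=20, avg_draw_w=400)    # 20 busier virtualization hosts

print(f"Before consolidation: {before_kwh:,.0f} kWh/yr")
print(f"After consolidation:  {after_kwh:,.0f} kWh/yr")
print(f"Estimated savings:    {before_kwh - after_kwh:,.0f} kWh/yr")
```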

If you are running an outdated version and are thinking about upgrading to Windows Server 2019, read our blog on Windows Server 2019 Key Features for more information.

AI

AI can be invaluable in the data center, with Google citing a 40% reduction in cooling energy after letting DeepMind loose on its data centers.
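Google hasn't published DeepMind's models, but the general idea (train a model on facility sensor data to predict PUE, then compare candidate cooling setpoints by their predicted PUE) can be sketched at toy scale. The synthetic data, feature set, and scikit-learn model below are purely illustrative assumptions, not Google's approach.

```python
# Toy sketch of ML-driven cooling optimization: train a model to predict PUE
# from facility sensor readings, then compare candidate cooling setpoints by
# their predicted PUE. Synthetic data and model choice are illustrative only.

import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Synthetic history: [outside_temp_c, it_load_kw, chilled_water_setpoint_c]
X = rng.uniform([5, 300, 7], [35, 900, 15], size=(2000, 3))
# Fake PUE that worsens with heat and load and improves with a higher setpoint.
y = 1.1 + 0.004 * X[:, 0] + 0.0002 * X[:, 1] - 0.005 * X[:, 2] + rng.normal(0, 0.01, 2000)

model = GradientBoostingRegressor().fit(X, y)

# Compare candidate setpoints for tomorrow's expected conditions.
for setpoint in (8.0, 10.0, 12.0):
    predicted_pue = model.predict([[28.0, 750.0, setpoint]])[0]
    print(f"Setpoint {setpoint:.0f} C -> predicted PUE {predicted_pue:.3f}")
```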

While it’s not likely that Google will be sharing DeepMind as an open-source giveaway any time soon, other companies like Verdigris offer potential solutions that leverage deep learning for data center energy efficiency.

It is inevitable that other solutions will pop up to provide similar capabilities in the future, though the pool is fairly dry for now.

It’s important to note that deep learning and AI-based systems to improve energy efficiency are not the same as Data Center Infrastructure Management (DCIM) tools.

DCIM

DCIM is fundamentally different from AI in that it still places all of the agency in the hands of humans.

DCIM tools can be used to optimize data center operations, but given the gargantuan mass of data they cope with, the tools simply can’t process and act on insights the way an AI can.

That being said, energy-specific DCIM systems like Schneider Electric’s EcoStruxure IT can prove incredibly helpful and more accessible than the fruits of AI’s labors.

With increased visibility into the goings-on of your data center, you can make more informed decisions; at least until the deep-learning programs can handle it all for us.

Many organizations have servers powered on but not doing anything of use; DCIM tools allow managers and administrators to find these servers and clean up poorly optimized workloads, as well as easily manage building and environment controls for better energy efficiency. 

Don’t simply rely on the tools, however: best practices for data center energy efficiency also involve manual inspections to catch insights that network tools may miss.
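As a hedged sketch of what that cleanup pass might look like when driven by exported utilization metrics, the following flags servers that sit near idle; the CSV layout, column names, and thresholds are assumptions, since real DCIM tools expose this data through their own exports and APIs.

```python
# Sketch: flagging likely "zombie" servers from exported utilization metrics.
# The CSV layout, column names, and thresholds are assumptions for illustration;
# real DCIM tools expose this data through their own exports or APIs.

import csv
from collections import defaultdict

CPU_IDLE_THRESHOLD = 5.0      # percent average CPU considered "doing nothing"
NETWORK_IDLE_THRESHOLD = 1.0  # Mbps average traffic considered negligible

def find_idle_servers(metrics_csv_path: str) -> list[str]:
    cpu = defaultdict(list)
    net = defaultdict(list)
    with open(metrics_csv_path, newline="") as f:
        for row in csv.DictReader(f):  # expected columns: server_id, cpu_pct, net_mbps
            cpu[row["server_id"]].append(float(row["cpu_pct"]))
            net[row["server_id"]].append(float(row["net_mbps"]))
    return [
        server
        for server in cpu
        if sum(cpu[server]) / len(cpu[server]) < CPU_IDLE_THRESHOLD
        and sum(net[server]) / len(net[server]) < NETWORK_IDLE_THRESHOLD
    ]

if __name__ == "__main__":
    for server in find_idle_servers("utilization_export.csv"):
        print(f"Candidate for decommissioning or consolidation: {server}")
```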

Conclusions: Improving Efficiency and Power Cost Savings

By applying the techniques, strategies, and technologies in this guide to data center energy efficiency best practices, you can steadily move closer to the near-perfect 1.0 PUE achieved by companies like Google and Facebook, and with that, enjoy the operational cost savings and lower environmental impact that come with a more efficient data center.

While there is a lot here, there’s no need to tackle everything at once.

Do you know your cooling falls behind industry standards? Perhaps reach out to solutions providers to improve that area.

Do you lack true visibility to help you manage your data center effectively? Begin by evaluating the DCIM tools market to help you make informed decisions.

Need to clear floor space and have aging servers? Look into your server virtualization options with your team. Regardless of the area, consistent effort toward lagging areas will nearly always pay off in OPEX savings or environmental impact improvements.

Moving Forward: Everything Must Be Considered When Future-Proofing New Data Centers

With data centers consuming more and more energy, we need to be cognizant of our impact on infrastructure, our energy systems, and the world.

Retrofitting existing data centers may only make sense in limited capacities, as many solutions are neither cost-effective nor worth the effort for existing facilities.

Where these practices really matter is after liquidating existing data centers and building out new ones.

If you’re considering what type of servers to get for a new data center, make sure to check out our posts on white box servers and comparing HPE vs Dell servers to find what’s best for your needs.

Greater energy efficiency will be part of making data centers more cost-effective for both operators and end-users, as well as lowering each individual facility's impact. As demand for data center capacity continues to rise, it will be increasingly important to have more efficient systems in place to balance power needs and keep costs down for everyone.

As long as we put serious effort to improve our energy efficiency, we can keep our environmental impact to a minimum and enjoy this big blue marble in the sky we call home for years to come.
