Energy Regulation and Data Centers

Author: Tom Deaderick

I understand "conservation". Almost every weekend, I hike in the mountains of Tennessee, North Carolina, or Virginia. I enjoy the outdoors and am a good steward of both my own land and the public lands I enjoy. While I am certain that I favor conservation, I don't know whether that is synonymous with being "green". A conservationist is interested in using as few resources as necessary to accomplish the same, or greater, productivity.

"Green" proponents seem to prefer that production, specifically U.S. production, be reduced, whether or not this achieves an overall conservation of resources. The "green" movement should never have become the attack on productivity that it has. A less productive America benefits no one, as less efficient industrial regions will simply consume more resources to produce the same amount, or, more typically, less.

Conservation and capitalism

No effective producer (capitalist) favors the waste of natural resources. Resources, such as energy, are expenses, and expenses reduce margins. Left completely alone, the businesses that are most efficient in their use of raw materials win out. These wins may take years or decades, but history shows that the production costs (including raw materials) of any given service or product decrease over time. Two forms of regulation now target data centers. One is essentially a raising of the bar; the other is a misguided "green" regulation, almost certain to result in greater depletion of resources for less overall production.

Data center design is as essential to energy efficiency as the equipment itself; newer equipment, as a rule, runs more efficiently, and more efficient products reach the marketplace every day. Below is a best-case scenario for a data center designed to be energy efficient:

  • Careful Hot/Cold Aisle Layout
  • High and low flow perforated floor tiles
  • Hot air exhaust via ceiling plenums
  • Flow-focused cabling
  • Enclosed cabinets with blanking panels

Because energy costs are a substantial component of monthly operations, designing a data center to be more energy efficient makes financial sense for any operator. With operating costs down, cabinet rental rates can be lowered to undercut the competition. Regulators, as well as certification bodies such as the Uptime Institute, have implemented programs to increase the likelihood that more data centers will be "green."

Raising the bar

There are regulations that "raise the bar" on everyone in a given industry. The Environmental Protection Agency's (EPA) Energy Star program is an example of regulation targeted uniformly at every data center. The Energy Star program is, in effect, a catalyst for more rapid change. By increasing the importance of energy consumption as an expense, the EPA singles out this expense for improvement. It is important to realize that this selection may come at the expense of other naturally occurring improvements, and that it may produce unintended consequences both outside the immediate industry and within it.

Kill the goose

There are other regulations pending which favor one business over another based on any number of arbitrary and nonsensical attributes. The Waxman-Markey draft and its Cap and Trade provisions appear as arcane and inscrutable as the U.S. tax code. If the proposed legislation worked exactly as its proponents describe, achieving exactly, and only, the intended consequences, it might lower energy consumption. This Rube Goldberg contraption would then achieve the positive results that might be achieved through the comparatively simple Energy Star system. However, the unintended negative consequences of Cap and Trade could be disastrous, and would by every realistic account drive business from the U.S.

Other challenges

The "natural selection" of more efficient data centers over less efficient ones is a relatively simple process (waiting for shrinking margins to drive less efficient data centers out of business or into reinvestment) that involves no outside action or additional bureaucracy. The only moving part is the price of energy. Catalyzing change is intrinsically more complex, even in its most simplified form. Assuming the U.S. simply cannot wait for the natural selection of energy-efficient data centers, an assessment metric is required.

The Green Grid has published a standard called Power Usage Effectiveness (PUE). PUE is the ratio of total facility power consumed to IT equipment power consumed; a higher ratio indicates that the data center is less efficient at converting total power into productive power. This metric is specifically not intended to compare one data center to another, but instead to serve as an internal benchmark for improvement within a single data center. It is nonetheless commonly misused for exactly that comparison.
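The ratio above is simple enough to express in a few lines. Here is a minimal sketch, using illustrative power figures (not measurements from any real facility):

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power divided by IT equipment power.

    A PUE of 1.0 would mean every watt entering the facility reaches the IT gear.
    """
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# Illustrative figures: 1,600 kW total facility draw, 1,000 kW reaching IT equipment.
print(pue(1600, 1000))  # 1.6 -- the other 600 kW feeds cooling, UPS losses, lighting
```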

Consider the PUE ratio for a colocation facility. The colocation facility controls the environmental systems, while clients control the contents of each cabinet. An article on the Green Grid web site quotes a data center operator with a "healthy" PUE of 1.6 who admits that the low rating was achieved by "running close to the edge" of capacity. Rarely will a competent engineer advocate running any system at the maximum of its capacity, as this is typically a recipe for disaster. Nonetheless, a full data center with an inefficient design would score better than an efficient data center at startup or lower capacity, making PUE a useless metric for comparing data centers.

If the goal were truly conservation, the focus would be on the contents of the cabinets rather than the data center shell. Virtualization is undoubtedly one of the most effective tools for energy conservation. A single virtualized server replaces multiple traditional servers, substantially reducing heat generation and power consumption. The technology is so valuable that even its tertiary impacts create substantial value. For example, disaster recovery planning is far easier and less expensive for virtual servers than for traditional ones.
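The scale of the savings is easy to sketch. The figures below are hypothetical round numbers, not measurements: ten lightly loaded physical servers consolidated onto one virtualization host.

```python
# Hypothetical consolidation: ten lightly loaded physical servers drawing
# roughly 300 W each, replaced by one virtualization host drawing 600 W.
# All figures are illustrative, not measurements.
physical_servers, watts_each = 10, 300
host_watts = 600

before_watts = physical_servers * watts_each   # 3000 W across the old fleet
after_watts = host_watts                       # 600 W for the single host
saved_watts = before_watts - after_watts
print(f"saved {saved_watts} W ({saved_watts / before_watts:.0%})")  # saved 2400 W (80%)
```

And that is before counting the cooling load avoided, since every watt not drawn by a server is also a watt of heat the facility never has to remove.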

Charts projecting future data center energy consumption assume no change in technology or practice. Adoption of virtualization is increasingly common, and especially prevalent in the data center, as a high percentage of colocation space is dedicated to disaster recovery systems. Solid state drives (flash drives) promise to reduce power consumption relative to hard disk drives. At this stage of development, server design does not yet take advantage of the architectural differences between flash and hard drives, but those changes will certainly follow as laptop battery life drives efficiencies that cross over into server design. One certainty about data center energy consumption predictions is that they will be wrong. Both of these technologies impact the contents of the cabinet, which is where the greatest opportunity for conservation lies, not in the data center facility.

Using PUE to encourage improvements by a colocation facility is like trying to hammer the board into a nail. The cabinet's contents control the equation.


If change in data center energy consumption must be artificially catalyzed, the simpler, proven solution is certainly preferable to an arcane model that is untested and prone to manipulation.