Smaller components could mean big savings for data centers

Written by
Scott Lyon, Office of Engineering Communications; Molly Seltzer, Andlinger Center for Energy and the Environment
Nov. 13, 2018

Researchers at Princeton and MIT have found a way to save big on power consumption for data centers while making a key electronic component much smaller.

The component, called a power converter, changes the flow of electricity to fit the needs of individual computer parts. All computers require converters to supply power, but the researchers are concentrating on those used in data centers. In papers published this year and last year in IEEE Transactions on Power Electronics, the researchers describe two aspects of a new approach to designing power converters that shrinks their size while increasing energy efficiency.

“Every kilowatt-hour of electricity produced to power data centers passes through multiple stages of these converters,” said Princeton’s Minjie Chen, assistant professor of electrical engineering and the Andlinger Center for Energy and the Environment, and the lead author of the two papers. “Improving their efficiency and reducing their size means the data centers can be smaller and more powerful while using less electricity for the same demand.”

Currently, converters are either single-stage or multi-stage. Single-stage versions are cheap to make but less sophisticated, and they are not used in high-performance systems such as data centers. Multi-stage versions perform better, but their complexity makes them more expensive and harder to build.

Chen’s approach combines the two versions into a new type called a merged multi-stage power converter. The new design keeps the performance of a multi-stage converter while being less expensive and more energy efficient. It relies on a modular design method with reconfigurable parts, allowing the converters to be physically smaller as well as electronically more sophisticated than older versions.
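To see why merging stages matters, note that a power-delivery chain is only as efficient as the product of its stage efficiencies, so every additional stage compounds the losses. The short Python sketch below illustrates that arithmetic with hypothetical stage efficiencies; it is an illustration of the general principle, not a model of Chen's converter design.

```python
# Illustrative only: the stage efficiencies below are hypothetical placeholders,
# not measurements of the merged multi-stage converter described in the papers.
from math import prod

def chain_efficiency(stage_efficiencies):
    """Overall efficiency of converter stages connected in series.

    Each stage passes along only a fraction of the power it receives,
    so the fractions multiply.
    """
    return prod(stage_efficiencies)

# A conventional multi-stage chain: front-end, intermediate bus, point-of-load.
conventional = chain_efficiency([0.96, 0.95, 0.94])

# A merged design that collapses the chain into fewer, jointly optimized stages.
merged = chain_efficiency([0.97, 0.96])

print(f"Conventional chain: {conventional:.1%} efficient, {1 - conventional:.1%} wasted")
print(f"Merged chain:       {merged:.1%} efficient, {1 - merged:.1%} wasted")
```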

If the technology is adopted, data centers could fit three times as many servers in a given space. The researchers also estimate that it would cut wasted energy by 25 percent compared with leading commercial equipment. Wide adoption by data centers could save more than $500 million annually in U.S. electricity costs. In terms of fossil fuel consumption, that savings is equivalent to taking 1 million cars off the road each year.
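For readers curious how an estimate like that can be assembled, the sketch below runs one version of the back-of-envelope arithmetic. The consumption, price and conversion-loss inputs are assumptions chosen for illustration, not figures from the papers; only the 25 percent reduction in wasted energy comes from the researchers' estimate.

```python
# Back-of-envelope arithmetic only. The three ASSUMED_* inputs are rough,
# illustrative values and are not taken from the researchers' papers.
ASSUMED_US_DATACENTER_TWH_PER_YEAR = 70.0  # assumed annual U.S. data-center electricity use
ASSUMED_PRICE_PER_KWH = 0.10               # assumed average electricity price, dollars per kWh
ASSUMED_CONVERSION_LOSS_FRACTION = 0.30    # assumed share of that energy lost across converter stages
REDUCTION_IN_WASTE = 0.25                  # the researchers' estimated cut in wasted energy

kwh_per_year = ASSUMED_US_DATACENTER_TWH_PER_YEAR * 1e9  # terawatt-hours to kilowatt-hours
kwh_saved = kwh_per_year * ASSUMED_CONVERSION_LOSS_FRACTION * REDUCTION_IN_WASTE
dollars_saved = kwh_saved * ASSUMED_PRICE_PER_KWH

print(f"Electricity saved: {kwh_saved:.2e} kWh per year")
print(f"Dollar savings:    ${dollars_saved / 1e6:.0f} million per year")
```

Under these assumed inputs the arithmetic lands in the neighborhood of the $500 million figure above, but the result is only as good as the assumptions behind it.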

Moving the concept from the lab to industry will require engineers to learn a new set of tools and concepts based around the unique design and control methods employed by the technology. "Our design method is significantly different from what companies are familiar with," Chen said.

Still, Chen envisions a future where his approach leads to power converters that can be programmed with software for any given use: a single unit that could be reprogrammed for solar panels, server farms or electric vehicles. The long-term goal, he said, is “hardware that allows for a lot of flexibility in how it manages power.”

Other researchers on the papers include Khurram Afridi of the University of Colorado, Boulder (now at Cornell University); Sombuddha Chakraborty of Texas Instruments; and David Perreault of the Massachusetts Institute of Technology. The two papers won the first- and second-place prize paper awards from the IEEE Power Electronics Society in 2017 and 2018, respectively.

The research was sponsored in part by Texas Instruments and the Siebel Energy Institute.