Sustainable HPC: How to Reduce Energy Consumption Without Sacrificing Performance

Researchers at Boston University propose strategies to balance computational efficiency and electrical costs in high-performance data centers.

High-performance computing (HPC) is essential for tackling scientific, medical, and engineering problems that require billions of calculations per second. However, this computational power comes at a cost: massive electricity consumption in data centers, impacting both the environment and organizational budgets.

Aware of this challenge, scientists Ayse Coskun and Ioannis Paschalidis from the Hariri Institute for Computing at Boston University are working to help data centers adopt energy management models that reduce electrical expenses without sacrificing performance. Their findings were recently published in IEEE Transactions on Sustainable Computing.


Why do HPC centers consume so much energy?

While processors in data centers can perform enormous volumes of operations in fractions of a second, they are not the only components demanding electricity. Experts say that data servers, network equipment, and cooling systems also require significant energy to operate continuously and reliably.


An opportunity to promote green energy

Regulating electricity consumption not only benefits individual data centers but also helps stabilize the electric grid as a whole. “When data centers adjust their consumption, they help offset the variability of renewable sources like solar and wind,” explains Coskun. This facilitates greater adoption of clean energy, especially in regions where renewable generation is inconsistent.


How to save electricity without losing computing capacity

One of the most promising approaches is participation in demand response programs. These programs allow data centers to negotiate more favorable energy rates if they agree to temporarily reduce their consumption during peak demand periods.

“The challenge is minimizing impact on critical workloads when implementing power limits,” says Daniel Wilson, co-author of the study. “Our algorithms can identify sensitive loads and apply energy restrictions selectively, with minimal impact on performance.”
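To make the idea concrete, the sketch below illustrates one simple way selective power capping could work: when a demand response event asks for a given power reduction, cap the least performance-sensitive jobs first. This is only an illustration of the general concept, not the algorithm published by the researchers; the job names, sensitivity scores, and reduction target are hypothetical.

```python
# Minimal sketch of selective power capping during a demand response event.
# Illustration only; jobs, sensitivities, and targets are made-up values.

from dataclasses import dataclass

@dataclass
class Job:
    name: str
    power_w: float      # current power draw of the nodes running the job
    min_power_w: float  # lowest cap the job tolerates
    sensitivity: float  # 0 = throughput barely affected, 1 = highly sensitive

def plan_power_caps(jobs, reduction_target_w):
    """Greedily cap the least sensitive jobs first until the requested
    power reduction is met (or no headroom remains)."""
    caps = {}
    remaining = reduction_target_w
    for job in sorted(jobs, key=lambda j: j.sensitivity):
        if remaining <= 0:
            break
        headroom = job.power_w - job.min_power_w
        cut = min(headroom, remaining)
        caps[job.name] = job.power_w - cut
        remaining -= cut
    return caps, max(remaining, 0.0)

jobs = [
    Job("climate_sim", 12000, 9000, sensitivity=0.9),
    Job("batch_rendering", 8000, 4000, sensitivity=0.2),
    Job("genome_index", 6000, 3500, sensitivity=0.4),
]
caps, shortfall = plan_power_caps(jobs, reduction_target_w=5000)
print(caps, "unmet reduction (W):", shortfall)
```

In this toy run, the 5 kW reduction is absorbed entirely by the rendering and indexing jobs, leaving the sensitive climate simulation untouched, which is the kind of outcome the quoted approach is aiming for.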


Global application, even in less resource-rich countries

Although not all electricity markets offer advanced demand response programs, smart energy management remains valuable. The techniques proposed by Coskun and Paschalidis enable data centers to adapt to dynamic energy prices or meet contractual commitments, which is especially relevant for countries with limited infrastructure.
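Even without a formal demand response market, adapting to dynamic prices can be as simple as shifting deferrable batch work toward cheaper hours. The sketch below shows that idea under stated assumptions; the day-ahead prices and the job parameters are invented for illustration and do not come from the study.

```python
# Toy sketch: pick the cheapest window for a deferrable batch job given
# hourly energy prices. Prices and job parameters are hypothetical.

def cheapest_start(prices, duration_h, deadline_h):
    """Return the start hour whose duration_h-hour window has the lowest
    total price, finishing no later than deadline_h."""
    best_start, best_cost = None, float("inf")
    for start in range(0, deadline_h - duration_h + 1):
        cost = sum(prices[start:start + duration_h])
        if cost < best_cost:
            best_start, best_cost = start, cost
    return best_start, best_cost

# Hypothetical day-ahead prices in $/MWh for 24 hours (cheaper overnight).
prices = [32, 30, 28, 27, 29, 35, 48, 60, 72, 75, 70, 65,
          62, 60, 58, 61, 68, 80, 85, 78, 60, 50, 42, 36]

start, cost = cheapest_start(prices, duration_h=4, deadline_h=24)
print(f"Schedule the 4-hour batch job to start at hour {start} "
      f"(relative cost {cost})")
```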


Next steps: global models and real-world scenarios

Supported by the Hariri Institute and the Institute for Sustainable Energy, the researchers will launch a new phase of their project with Richard Stuebi from Questrom School of Business. Their goal is to study how these techniques perform in various international energy markets under real conditions and to develop recommendations applicable across different contexts.


Conclusion: In the era of artificial intelligence, demand for HPC will only grow, but its sustainability will depend heavily on the ability of data centers to adapt to the energy landscape and optimize resources without sacrificing performance. Initiatives like the one at Boston University point toward a future where intensive computing and energy sustainability go hand in hand.

Source: Boston University