Artificial intelligence now underpins business and financial transactions, healthcare, technology development, research, and much more. Consumers rely on AI when they stream video, bank online, or run a web search, often without realizing it. Behind these services are more than 10,000 data centers worldwide, huge warehouses packed with thousands of computer servers and other infrastructure for storing, managing, and processing data. The United States alone has more than 5,000 data centers, and new ones are being built every day, there and around the world. Often dozens cluster close to where people live, drawn by abundant power and by policies offering tax breaks and other incentives.
Data centers consume huge amounts of electricity. According to the Electric Power Research Institute, U.S. data centers consumed more than 4% of the country’s total electricity in 2023, and that share could rise to 9% by 2030. A single large data center can consume as much power as 50,000 homes.
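To get a feel for the scale behind those figures, here is a rough back-of-envelope sketch. The total U.S. consumption of roughly 4,000 TWh per year and the average household use of about 10,500 kWh per year are assumptions used for illustration, not numbers from the article.

```python
# Back-of-envelope scale check for the EPRI figures quoted above.
# Assumed inputs (not from the article): total U.S. electricity use of
# ~4,000 TWh/year and average household use of ~10,500 kWh/year.

US_TOTAL_TWH = 4000          # assumed annual U.S. electricity consumption
HOME_KWH_PER_YEAR = 10_500   # assumed average U.S. household consumption

share_2023 = 0.04            # >4% of U.S. electricity in 2023 (EPRI, cited above)
share_2030 = 0.09            # possible 9% share by 2030 (EPRI, cited above)

dc_2023_twh = US_TOTAL_TWH * share_2023
dc_2030_twh = US_TOTAL_TWH * share_2030
print(f"Implied data center demand, 2023: ~{dc_2023_twh:.0f} TWh/year")
print(f"Implied data center demand, 2030: ~{dc_2030_twh:.0f} TWh/year")

# "One large data center can consume the power of 50,000 homes":
homes = 50_000
avg_load_mw = homes * HOME_KWH_PER_YEAR / 8760 / 1000  # kWh/yr -> avg kW -> MW
print(f"50,000 homes correspond to an average load of ~{avg_load_mw:.0f} MW")
```

Under these assumptions, the 2023 share works out to roughly 160 TWh per year, and a single large facility draws a steady load on the order of 60 MW, comparable to a small power plant running around the clock.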
The sudden need for so many data centers is creating enormous challenges for the technology and energy industries, government policymakers, and everyday consumers alike. Research scientists and faculty at the MIT Energy Initiative (MITEI) are exploring many aspects of this problem, from power sourcing and grid improvements to analytical tools for increasing efficiency. Data centers have quickly become the energy challenge of our time.
Unexpected demands bring unexpected solutions
Several companies that use data centers to provide cloud computing and data management services have announced some surprising moves to power all that activity. Proposals include building their own small nuclear power plants near their data centers and even restarting one of the intact reactors at Three Mile Island, which has been closed since 2019. (Another reactor at that plant partially melted down in 1979, causing the nation’s worst nuclear accident.) Already, the need to power AI has delayed the planned retirement of some coal-fired plants and raised prices for residential consumers. Meeting the demands of data centers not only strains the power grid, it also threatens to delay the transition to the clean energy needed to stop climate change.
From a power perspective, the data center problem has many facets. Here are some of the things MIT researchers are focusing on and why they matter.
Unprecedented surge in electricity demand
“In the past, computing didn’t use much electricity,” says William H. Green, MITEI director and the Hoyt C. Hottel Professor of Chemical Engineering at MIT. “Electricity has been used to run industrial processes, power household devices such as air conditioning and lighting, and more recently to power heat pumps and charge electric vehicles. But now suddenly the electricity used in computing in general, and in data centers in particular, is becoming a huge new demand that no one could have predicted.”
Why the lack of foresight? Electricity demand typically grows by about 0.5% per year, and utilities bring new generators online and make other investments as needed to meet the expected increase. But the data centers now coming online are creating an unprecedented surge in demand that operators did not anticipate. Moreover, the new demand is constant: data centers must provide service all day, every day, with no interruptions in processing large data sets, accessing stored data, or running the cooling equipment that keeps the densely packed computers from overheating.
Moreover, even if enough electricity is generated, getting it to where it is needed can be problematic, explains MITEI researcher Deepjyoti Deka. “The power grid is a network-wide operation, and the grid operator may have enough generation in another location, or even elsewhere in the country, but the lines may not have enough capacity to carry the electricity to where they want it to go.” Transmission capacity therefore needs to be expanded, which Deka says is a slow process.
Then there is the “interconnection queue.” Adding new users (“loads”) or new generators to an existing grid can sometimes cause instability or other problems for everyone already on the grid, so proposed new loads and generators may have to wait their turn in line before connecting. Much of the current interconnection queue is already filled with new solar and wind projects, and the typical delay is now about five years. Meeting the demands of new data centers while ensuring that the quality of service elsewhere is not compromised is a problem that must be solved.
Finding clean power sources
To further complicate matters, many companies, including so-called “hyperscalers” like Google, Microsoft, and Amazon, have publicly pledged to achieve net-zero carbon emissions within the next decade. Many have made progress toward those clean energy goals through “power purchase agreements”: they sign contracts to buy electricity from, say, a solar or wind facility, sometimes providing the funds needed to build it. However, this approach to clean energy has its limits when faced with the extreme power demands of a data center.
Meanwhile, soaring electricity consumption is delaying coal plant retirements in many states. There are simply not enough renewable energy sources to serve both the hyperscalers and existing users, including individual consumers, so conventional power plants fired by fossil fuels such as coal are needed more than ever.
As hyperscalers look for clean energy sources for their data centers, one option is to build their own wind and solar facilities. But such facilities generate electricity only intermittently, and given the need for uninterrupted power, a data center would have to maintain costly energy storage. Alternatively, it could rely on natural gas or diesel generators for backup power, but those generators would need to be coupled with equipment to capture their carbon emissions, plus a nearby site for permanently storing the captured carbon.
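To see why dedicated storage is such an expensive proposition, consider the rough sketch below, which sizes a battery to carry an assumed 60 MW facility through a 12-hour lull in wind and solar output. The load, ride-through duration, and cost per kilowatt-hour are illustrative assumptions, not figures from the article.

```python
# Rough sketch of why backup storage for an all-renewable data center is costly.
# All numbers here are illustrative assumptions, not figures from the article.

DATA_CENTER_LOAD_MW = 60       # assumed steady facility load
BACKUP_HOURS = 12              # assumed length of a wind/solar lull to ride through
BATTERY_COST_PER_KWH = 300     # assumed installed cost of battery storage, $/kWh

energy_needed_mwh = DATA_CENTER_LOAD_MW * BACKUP_HOURS
cost_usd = energy_needed_mwh * 1000 * BATTERY_COST_PER_KWH

print(f"Storage needed to ride through {BACKUP_HOURS} h: {energy_needed_mwh:,.0f} MWh")
print(f"Approximate battery cost at ${BATTERY_COST_PER_KWH}/kWh: ${cost_usd/1e6:,.0f} million")
```

Under these assumed numbers, a half-day of backup alone runs to hundreds of millions of dollars, before accounting for longer lulls or the oversized wind and solar capacity needed to recharge it.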
This complexity is leading some hyperscalers to switch to nuclear power. As Green noted, “Nuclear energy is a good fit for the needs of data centers because nuclear power plants can reliably produce a lot of power without interruption.”
In a widely publicized move last September, Microsoft signed a 20-year agreement to buy power from Constellation Energy once it restarts one of the intact reactors at the now-shuttered Three Mile Island plant, the site of the 1979 accident. If regulators approve, Constellation plans to bring the reactor online by 2028, with Microsoft purchasing all of the power it produces. Amazon also agreed to purchase power from another nuclear plant that was on the verge of closing due to financial problems. And in early December, Meta issued a request for proposals to identify nuclear developers that could help the company meet its AI requirements and sustainability goals.
Other nuclear news focuses on small modular reactors (SMRs), factory-built modular power plants that could potentially be installed near data centers without the cost overruns or delays common in large-scale plant construction. Google recently ordered SMRs to generate power for its data centers, with the first phase of construction scheduled for completion by 2030 and the rest by 2035.
Some hyperscalers are investing in new technologies. Google, for example, is pursuing next-generation geothermal projects, and Microsoft has signed a deal to purchase power from a startup’s fusion power plant beginning in 2028, even though the fusion technology has yet to be demonstrated.
Reducing power demand
Other approaches to providing sufficient clean power focus on making data centers and their operations more energy efficient so that they perform the same computing tasks with less power. Faster computer chips and algorithms optimized to use less energy are already helping to reduce the load and the heat generated.
Another idea being tried is to shift computing tasks on the grid to times and places where carbon-free energy is available. Deka explains: “If a task needs to be completed by a certain deadline rather than immediately, can we postpone it, or move it to a data center elsewhere in the U.S. or overseas where electricity is more abundant, cheaper, and/or cleaner? We call this approach ‘carbon-aware computing.’” Deka says it is not yet clear whether every workload can easily be moved or delayed. “If you think about a generative AI-based task, can you easily break it into smaller pieces that can be run across the country, solved using clean energy, and then put back together? And what is the cost of doing this kind of division of labor?”
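As a concrete illustration of carbon-aware computing, the minimal sketch below defers a flexible job to whichever hour and region has the lowest forecast grid carbon intensity before the job’s deadline. The regions, forecast values, and function are hypothetical examples, not part of any tool described in the article.

```python
# Minimal sketch of "carbon-aware computing": schedule a deferrable job in the
# hour and region with the lowest forecast carbon intensity before its deadline.
# The regions and intensity numbers below are made up for illustration.

from typing import Dict, List, Tuple

# Hypothetical hourly carbon-intensity forecasts (gCO2 per kWh) per region.
forecasts: Dict[str, List[float]] = {
    "region_a": [450, 430, 300, 120, 110, 240],  # e.g., a solar-heavy midday dip
    "region_b": [380, 370, 360, 350, 340, 330],
    "region_c": [200, 500, 520, 510, 490, 180],
}

def best_slot(forecasts: Dict[str, List[float]], deadline_hour: int) -> Tuple[str, int, float]:
    """Return (region, hour, intensity) with the lowest intensity before the deadline."""
    best = None
    for region, series in forecasts.items():
        for hour, intensity in enumerate(series[: deadline_hour + 1]):
            if best is None or intensity < best[2]:
                best = (region, hour, intensity)
    return best

region, hour, intensity = best_slot(forecasts, deadline_hour=5)
print(f"Run the job in {region} at hour {hour} (~{intensity:.0f} gCO2/kWh)")
```

A real scheduler would also have to weigh the data-movement and coordination costs Deka describes, which this toy example ignores.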
Of course, this approach is limited by the interconnection queue problem: accessing clean energy in another region or state can be difficult. However, efforts are underway to streamline the regulatory framework so that critical interconnections can be built faster and more easily.
What about the neighbors?
A major concern with all of the options for powering data centers is their impact on residential energy consumers. When a data center comes to a neighborhood, the issues are practical, not just aesthetic: Will local electric service become less reliable? Where will new transmission lines be located? And who will pay for the new generators, equipment upgrades, and so on? When a new manufacturing facility or industrial plant arrives, the downsides are usually offset by the new jobs it creates. That is not the case for a data center, which may require only a few dozen employees.
There are standard rules for how the costs of grid maintenance and upgrades are shared and allocated, but the arrival of large new data centers changes the situation completely. As a result, public agencies must now reconsider traditional rate structures to avoid unduly burdening residents with the costs of the infrastructure changes needed to host data centers.
MIT’s contribution
Researchers at MIT are thinking about and exploring a variety of options to solve the problem of providing clean power to data centers. For example, they are investigating architectural designs that use natural ventilation to promote cooling, equipment layouts that allow for better airflow and power distribution, and energy-efficient air conditioning systems based on new materials. They are creating new analytical tools to assess the impact of building data centers on the U.S. power system and find the most efficient ways to provide clean energy to facilities. Other research looks at how to match the output of small nuclear reactors to the needs of data centers and how to speed up construction of such reactors.
The MIT team also focuses on developing decision support systems for choosing the best sources of backup power and long-duration storage and for siting proposed new data centers, taking into account the availability of power and water, regulatory considerations, and even the potential to put a facility’s substantial waste heat to use, for example, to warm nearby buildings. Technology development projects include designing faster, more efficient computer chips and more energy-efficient computing algorithms.
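As a toy illustration of the kind of multi-criteria trade-off such decision support tools must weigh, the sketch below ranks hypothetical candidate sites on power availability, water availability, permitting timeline, and waste-heat reuse potential. The sites, criteria, weights, and scores are invented for illustration and do not represent MIT’s actual models.

```python
# Toy multi-criteria scoring for siting a data center. The candidate sites,
# criteria, weights, and scores are invented for illustration only.

sites = {
    "site_1": {"power": 0.9, "water": 0.6, "permitting": 0.4, "waste_heat_reuse": 0.8},
    "site_2": {"power": 0.7, "water": 0.8, "permitting": 0.8, "waste_heat_reuse": 0.3},
    "site_3": {"power": 0.5, "water": 0.7, "permitting": 0.9, "waste_heat_reuse": 0.6},
}

# Assumed relative importance of each criterion (weights sum to 1).
weights = {"power": 0.4, "water": 0.25, "permitting": 0.2, "waste_heat_reuse": 0.15}

def score(site: dict) -> float:
    """Weighted sum of normalized criterion scores (higher is better)."""
    return sum(weights[k] * site[k] for k in weights)

ranked = sorted(sites.items(), key=lambda kv: score(kv[1]), reverse=True)
for name, attrs in ranked:
    print(f"{name}: {score(attrs):.2f}")
```

Real siting studies would of course replace these invented scores with engineering and regulatory data, but the basic idea of weighing competing criteria is the same.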
In addition to providing leadership and funding to many research projects, MITEI serves as a convener to bring together companies and stakeholders to address this issue. At MITEI’s 2024 Annual Research Conference, a panel of representatives from two hyperscalers and two companies that design and build data centers came together to discuss their challenges, possible solutions, and where MIT research could be most beneficial.
As data centers continue to be built and computing continues to create unprecedented demand for electricity, scientists and engineers are racing to provide the ideas, innovations, and technologies that can meet this need while also accelerating the transition to a decarbonized energy system.