Vijay Gadepally is a senior staff member at MIT Lincoln Laboratory, where he leads several projects at the Lincoln Laboratory Supercomputing Center (LLSC) to make computing platforms, and the artificial intelligence systems that run on them, more efficient. Here, Gadepally discusses the growing use of generative AI in everyday tools, its hidden environmental impacts, and some of the ways Lincoln Laboratory and the larger AI community can reduce emissions for a greener future.
Q: What trends are you seeing in how generative AI is being used in computing?
A: Generative AI uses machine learning (ML) to create new content, such as images and text, based on the data that is input into the ML system. At LLSC, we have designed and built some of the largest academic computing platforms in the world, and over the past few years we have seen an explosion in the number of projects that need access to high-performance computing for generative AI. We're also seeing how generative AI is transforming all kinds of fields and domains. ChatGPT, for example, is affecting classrooms and workplaces faster than regulations can keep up.
We can imagine all kinds of uses for generative AI within the next decade, including supporting high-performance virtual assistants, developing new drugs and materials, and improving our understanding of basic science. Although we cannot predict everything that generative AI will be used for, we can be confident that its impact on computing, energy, and climate will continue to grow very rapidly due to increasingly complex algorithms.
Q: What strategies is LLSC using to mitigate these climate impacts?
A: We're always looking for ways to make computing more efficient. Doing so helps data centers make the best use of their resources and helps our scientific colleagues advance their fields as efficiently as possible.
For example, we've been reducing the amount of power our hardware consumes by making simple changes, similar to dimming or turning off the lights when you leave a room. In one experiment, we applied power limiting to reduce the energy consumption of a group of graphics processing units by 20 to 30 percent with minimal impact on performance. This technique also lowers the hardware's operating temperature, making the GPUs easier to cool and helping them last longer.
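As a rough illustration of how such a power cap could be applied on NVIDIA hardware, here is a minimal sketch using the pynvml management library. The 60 percent cap is an assumed, illustrative value rather than the setting used at LLSC, and changing power limits generally requires administrator privileges.

```python
# Sketch: cap each GPU's power draw to a fraction of its default limit.
# Illustrative only -- the 0.6 fraction is an assumption, not LLSC's setting.
# Requires the nvidia-ml-py package (pynvml) and admin/root privileges.
import pynvml

CAP_FRACTION = 0.6  # assumed fraction of the default power limit

pynvml.nvmlInit()
try:
    for i in range(pynvml.nvmlDeviceGetCount()):
        handle = pynvml.nvmlDeviceGetHandleByIndex(i)
        default_mw = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(handle)
        min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
        # Clamp the requested cap to the range the card actually supports.
        target_mw = max(min_mw, min(max_mw, int(default_mw * CAP_FRACTION)))
        pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)
        print(f"GPU {i}: power limit set to {target_mw / 1000:.0f} W "
              f"(default {default_mw / 1000:.0f} W)")
finally:
    pynvml.nvmlShutdown()
```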
Another strategy is to change our behavior to be more climate conscious. At home, some of us might choose renewable energy sources or intelligent scheduling. We use similar techniques at LLSC, such as training AI models when temperatures are cooler or when local grid energy demand is low.
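One way to picture this kind of scheduling is a small script that holds a training job until the local grid's carbon intensity drops below a threshold. In the sketch below, the get_grid_carbon_intensity helper and the 200 gCO2/kWh threshold are hypothetical placeholders; a real deployment would query a regional grid-data service.

```python
# Sketch: delay a training job until grid carbon intensity is low.
# get_grid_carbon_intensity() and the threshold are hypothetical placeholders;
# a real system would query a grid-data provider for the local region.
import subprocess
import time

THRESHOLD_G_PER_KWH = 200   # assumed "clean enough" level, gCO2 per kWh
CHECK_INTERVAL_S = 15 * 60  # re-check every 15 minutes

def get_grid_carbon_intensity() -> float:
    """Placeholder: return current grid carbon intensity in gCO2/kWh."""
    raise NotImplementedError("query your regional grid data service here")

def wait_for_clean_grid() -> None:
    """Block until the grid is below the assumed carbon-intensity threshold."""
    while get_grid_carbon_intensity() > THRESHOLD_G_PER_KWH:
        time.sleep(CHECK_INTERVAL_S)

if __name__ == "__main__":
    wait_for_clean_grid()
    # Launch the training job once the grid is relatively clean.
    subprocess.run(["python", "train.py"], check=True)
```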
We also realized that a lot of the energy consumed in computing is often wasted, much as a water leak drives up your bill without providing any benefit to your home. We've developed new techniques that monitor running compute workloads and then kill those that aren't likely to produce good results. Surprisingly, in many cases we found that most computations could be terminated early without compromising the final result.
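The idea can be sketched as a simple early-termination rule: compare each run's validation accuracy after a few epochs and kill the runs that lag well behind the best one. The check epoch and 10-point margin below are assumptions for illustration, not LLSC's actual stopping criteria.

```python
# Sketch: stop training runs whose early validation accuracy lags well behind
# the best run seen so far. The check epoch and margin are assumed values
# for illustration, not LLSC's actual stopping criteria.

CHECK_EPOCH = 5   # epoch at which runs are compared (assumed)
MARGIN = 0.10     # allowed gap below the best run's accuracy (assumed)

def should_terminate(run_accuracy: float, best_accuracy_so_far: float) -> bool:
    """Return True if this run is unlikely to catch up and can be killed early."""
    return run_accuracy < best_accuracy_so_far - MARGIN

# Example: accuracies of candidate hyperparameter runs at CHECK_EPOCH.
runs = {"run_a": 0.81, "run_b": 0.62, "run_c": 0.78}
best = max(runs.values())
for name, acc in runs.items():
    if should_terminate(acc, best):
        print(f"{name}: terminating early (acc {acc:.2f} vs best {best:.2f})")
    else:
        print(f"{name}: continuing training")
```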
Q: What are examples of projects that reduce the energy consumption of generative AI programs?
A: We recently built a climate-aware computer vision tool. Computer vision is an area focused on applying AI to images: distinguishing cats from dogs in an image, correctly labeling objects within an image, or finding components of interest within an image.
Our tool included real-time carbon telemetry, which reports how much carbon the local grid is emitting while the model runs. Based on this information, our system automatically switches to an energy-efficient version of the model, typically one with fewer parameters, during periods of high carbon intensity, or to a much higher-fidelity version during periods of low carbon intensity.
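A minimal sketch of this switching logic follows. The threshold, the model variants, and the get_grid_carbon_intensity helper are illustrative placeholders standing in for the real telemetry and models described above.

```python
# Sketch: route inference to a smaller model when the grid is carbon-intensive,
# and to a higher-fidelity model when it is clean. The threshold and
# get_grid_carbon_intensity() are illustrative placeholders.

CARBON_THRESHOLD = 300  # gCO2/kWh above which the grid is treated as "dirty" (assumed)

def get_grid_carbon_intensity() -> float:
    """Placeholder: return real-time carbon intensity for the local grid."""
    raise NotImplementedError("connect to real-time carbon telemetry here")

def select_model(efficient_model, high_fidelity_model):
    """Return the model variant appropriate for current grid conditions."""
    if get_grid_carbon_intensity() > CARBON_THRESHOLD:
        return efficient_model       # fewer parameters, less energy per inference
    return high_fidelity_model       # higher accuracy when the grid is clean

# Usage (hypothetical model objects):
# model = select_model(small_vision_model, large_vision_model)
# prediction = model(image)
```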
With this, we saw a reduction of almost 80% in carbon emissions over a period of one to two days. We recently extended this idea to other generative AI tasks, such as text summarization, and found the same results. Interestingly, in some cases, performance improved after using our technique!
Q: What can we do as consumers of generative AI to mitigate its climate impacts?
A: As consumers, we can ask our AI providers to offer greater transparency. For example, on Google Flights you can see the carbon footprint associated with each flight option. We need similar kinds of measurements from our generative AI tools so that we can consciously decide which products or platforms to use based on our priorities.
We can also strive to become more educated about generative AI emissions in general. Many of us are familiar with vehicle emissions, and it may help to discuss generative AI emissions in comparative terms. For example, it may surprise people to learn that creating one image requires roughly the same amount of energy as driving four miles in a gasoline-powered car, or that fully charging an electric car takes about as much energy as generating roughly 1,500 text summarizations.
There are many instances where consumers would be willing to make a trade-off if they knew the implications of that trade-off.
Q: What do you see for the future?
A: Mitigating generative AI's impact on the climate is a problem that people around the world are working on with similar goals. We do a lot of work here at Lincoln Laboratory, but that's just scratching the surface. In the long term, data centers, AI developers, and energy grids will need to work together to perform "energy audits" and uncover other unique ways to improve computing efficiency. Moving forward will require more partnership and collaboration.
If you would like to learn more about these efforts or collaborate with Lincoln Laboratory, please contact Vijay Gadepally.