This blog is co-authored by Professor Aleksandra Przegalińska and Denise Lee.
As artificial intelligence (AI) moves from the virtual world into real-world applications, it is becoming clear that bigger is not always better.
Recent experiences with AI development and deployment have revealed the power of a tailored, ‘proportional’ approach. While the general trend is to pursue ever-larger models and more powerful systems, the AI community is increasingly recognizing the value of right-sized solutions. This more focused and efficient approach has proven effective in developing sustainable AI models that not only reduce resource consumption but also deliver better results.
By prioritizing proportionality, developers can create AI systems that are adaptable, cost-effective, and environmentally friendly without sacrificing performance or functionality. This shift in perspective is driving innovation by aligning technological advancement with sustainability goals, demonstrating that ‘smarter’ often trumps ‘bigger’ in AI development. It is prompting a reevaluation of fundamental assumptions about AI progress, one that considers not only the raw capabilities of AI systems but also their efficiency, scalability, and environmental impact.
From our vantage points in academia (Aleksandra) and business (Denise), we have watched important questions emerge that demand serious reflection. How can we harness the incredible potential of AI in a sustainable way? The answer lies in a simple yet frequently overlooked principle: proportionality.
The computational resources required to train and operate generative AI models are significant. To put this in perspective, consider the following data: researchers have estimated that training a single large-scale language model can consume about 1,287 MWh of electricity and emit the equivalent of 552 tons of carbon dioxide.(1) That is roughly the electricity the average American household consumes over 120 years.(2)
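As a quick sanity check on the household comparison, the arithmetic can be sketched as follows. The household figure is an assumption here: roughly 10,700 kWh of electricity per year for an average American household, which is approximately the often-cited U.S. average.

```python
# Sanity check: training energy vs. average U.S. household consumption.
# Assumption: an average U.S. household uses ~10,700 kWh of electricity per year.
TRAINING_ENERGY_MWH = 1_287        # estimated energy to train one large model
HOUSEHOLD_KWH_PER_YEAR = 10_700    # assumed average annual household usage

training_energy_kwh = TRAINING_ENERGY_MWH * 1_000
household_years = training_energy_kwh / HOUSEHOLD_KWH_PER_YEAR
print(f"Equivalent household-years: {household_years:.0f}")  # ≈ 120
```

Under that assumed household figure, the division lands almost exactly on the 120-year comparison cited above.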
Researchers also estimate that AI’s electricity demand could reach 85 to 134 TWh per year by 2027.(3) To put this figure into context, the high end of that range surpasses the annual electricity consumption of countries such as the Netherlands (108.5 TWh in 2020) or Sweden (124.4 TWh in 2020).(4)
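To see how the projected range compares with those national totals (all figures from the sources above), a quick comparison:

```python
# Projected AI electricity demand by 2027 (TWh/yr) vs. national consumption (2020).
AI_DEMAND_LOW_TWH, AI_DEMAND_HIGH_TWH = 85, 134
countries_twh = {"Netherlands": 108.5, "Sweden": 124.4}

for name, twh in countries_twh.items():
    exceeds = AI_DEMAND_HIGH_TWH > twh
    print(f"High-end projection ({AI_DEMAND_HIGH_TWH} TWh) exceeds {name} ({twh} TWh): {exceeds}")
```

The high-end projection exceeds both countries’ annual totals, while both totals fall inside the projected 85–134 TWh range itself.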
These numbers matter, but they should be considered in the context of AI’s broader potential. Despite their energy requirements, AI systems can increase efficiency across many sectors, both within technology and beyond.
For example, AI-optimized cloud computing services have shown the potential to reduce energy consumption in data centers by up to 30%.(5) In software development, AI-based code completion tools can significantly reduce the time and computational resources required for programming tasks, potentially saving millions of CPU hours per year across the industry.(6)
Nonetheless, balancing AI’s energy needs against its potential for increased efficiency is where proportionality comes in. It’s about right-sizing your AI solutions: use a scalpel instead of a chainsaw. When a fuel-guzzling SUV is more than you need, opt for a nimble electric scooter.
We are not suggesting we abandon cutting-edge AI research. Far from it. But we can make smarter decisions about how and when to deploy these powerful tools. In many cases, smaller, more specialized models can do the job with a much lower environmental impact.(7) It’s really about smart business. Efficiency. Sustainability.
However, shifting to a proportional mindset can be difficult. It requires a level of AI fluency that many organizations still struggle with, along with strong interdisciplinary dialogue among technology experts, business strategists, and sustainability specialists. This collaboration is essential to developing and implementing truly intelligent and efficient AI strategies.
These strategies prioritize intelligence in design, efficiency in execution, and sustainability in practice. The role of energy-efficient hardware and networking in data center modernization cannot be overstated.
By leveraging cutting-edge power-optimized processors and highly efficient networking equipment, organizations can significantly reduce the energy usage of AI workloads. Additionally, implementing a comprehensive energy visibility system can provide valuable insight into the emissions impacts of AI operations. This data-driven approach allows companies to make informed decisions about resource allocation, identify areas for improvement, and accurately measure the environmental impact of their AI initiatives. As a result, organizations can not only save money, but also demonstrate real progress toward their sustainability goals.
Paradoxically, the most impactful and smartest application of AI may be to optimize both performance and environmental considerations by utilizing fewer computing resources. By combining proportional AI development with state-of-the-art, energy-efficient infrastructure and robust energy monitoring, we can create a more sustainable and responsible AI ecosystem.
The solutions we create do not come from a single source. Academia and business have much to learn from each other, as we have learned through our collaboration. AI that scales responsibly will be the product of many people working together on an ethical framework, integrating diverse perspectives, and committing to transparency.
Let’s make AI work for us.
(1) Patterson, D., Gonzalez, J., Le, Q., Liang, C., Munguia, L.-M., Rothchild, D., So, D., Texier, M., & Dean, J. (2021). Carbon emissions and large-scale neural network training. arXiv.
(2) Mehta, S. (July 4, 2024). How much energy does LLM consume? Revealing the power of AI. Association of Data Scientists.
(3) De Vries, A. (2023). The growing energy footprint of artificial intelligence. Joule, 7(10), 2191-2194. doi:10.1016/j.joule.2023.09.004
(4) De Vries, A. (2023). The growing energy footprint of artificial intelligence. Joule, 7(10), 2191-2194. doi:10.1016/j.joule.2023.09.004
(5) Strubell, E., Ganesh, A., & McCallum, A. (2019). Energy and policy considerations for deep learning in NLP. Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. doi:10.18653/v1/p19-1355
(6) Strubell, E., Ganesh, A., & McCallum, A. (2019). Energy and policy considerations for deep learning in NLP. Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. doi:10.18653/v1/p19-1355
(7) CottGroup. (2024). A smaller, more efficient artificial intelligence model. CottGroup.