In a world driven by artificial intelligence, electricity consumption is rising at an ever-faster pace. From chatbots to image-generation tools, our dependence on AI is constantly increasing. But alongside this rapid growth lies a developing concern: energy. Stephen Foulger, professor of Materials Science and Engineering at Clemson University, warned that the huge computing power required to train and use AI models is putting increasing pressure on global electricity systems. As organisations expand and build bigger data centres, researchers are looking for ways to make the technology more energy efficient. Foulger highlights how the challenge is difficult to ignore. From geopolitics to AI, timesnownew.com gets the views of
the best in the world. With the help of US experts, we decode how energy demand is soaring with rising AI use.
Q. Why Does AI Consume So Much Energy?
Foulger notes that modern AI tools depend on extremely large-scale computation that demands enormous electrical power. He said, “Current AI embodies large-scale matrix operations that are massive.” According to him, the energy demand does not come only from training AI models but also from maintaining and deploying them at scale.
Large Language Models (LLMs) and image-generation models often process millions to trillions of operations, requiring substantial power.
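To see why these matrix operations add up so quickly, here is a rough, illustrative back-of-envelope sketch. The matrix sizes are assumptions chosen for illustration, not figures from the interview:

```python
# Illustrative arithmetic (assumed sizes, not from the article):
# counting the floating-point operations in one dense matrix
# multiplication, the core workload Foulger describes.

def matmul_flops(m, k, n):
    """An (m x k) @ (k x n) product needs m*k*n multiplies and
    roughly as many additions, i.e. about 2*m*k*n FLOPs."""
    return 2 * m * k * n

# One hypothetical 4096 x 4096 weight matrix applied to a single
# token vector already costs tens of millions of operations:
per_token = matmul_flops(1, 4096, 4096)
print(per_token)  # 33554432, i.e. ~33.6 million FLOPs

# A model stacks many such layers and serves many tokens to many
# users, so the totals climb into the billions and trillions --
# which is where the electricity bill comes from.
```

The point of the sketch is only the scaling: every extra layer, token, and user multiplies this per-operation cost, which is why serving millions of users draws so much power.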
Q. Which AI Tools Require The Most Power?
Foulger highlighted that popularly used
AI applications often consume large amounts of electricity. These include LLMs, spam filters, spell checkers, image generators and recommendation systems. However, he noted that tools like ChatGPT and image-generation platforms are the most energy-intensive because of their scale and complexity. He added that image-related AI tasks need longer processing times, something many users already notice while rendering high-quality images.
Q. Why Are Data Centres Becoming A Concern?
The professor described the issue as concerning and compared it to the heat generated by everyday computers. He said that even conventional desktop computers warm up during heavy usage; when this is multiplied across massive data centres serving millions of users simultaneously, energy needs rise sharply. “There’s just a huge amount of electrical energy being used to go through these models,” he added. According to him, cooling systems inside data centres also raise overall power consumption, making the infrastructure even more demanding. The professor estimated that AI currently accounts for around one per cent of global energy consumption, but he believes that the scale is rising exponentially.
Q. What Solutions Are Researchers Exploring?
Foulger stated that he and his team are developing next-generation polymer-based materials that could make AI computations more energy efficient. He believes the core computational systems being developed may become nearly a thousand times more efficient than current technology for certain operations. As AI tools become more popular and integrated into smartphones, vehicles and workplaces, their energy demands are likely to grow alongside usage. Foulger asserts that addressing the issue is essential as AI continues to expand into everyday life.