Artificial intelligence (AI) is transforming industries, driving innovation and efficiency, but its sustainability is crucial for long-term societal and environmental benefits.
A key challenge for sustainable AI growth is energy consumption: large language models (LLMs) must process massive datasets rapidly, demanding significant computational power and, in turn, large amounts of energy.
Recent comments from industry executives anticipate roughly 100 times more compute for the next generation of LLM training, while the energy needed to train a leading model has grown tenfold every two years.
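To put that growth rate in perspective, a tenfold increase every two years implies an annual growth factor of about 3.16x (a quick back-of-the-envelope calculation, not a figure from the source):

```python
# If training energy grows 10x every two years, the implied
# annual growth factor is the square root of 10.
annual_factor = 10 ** (1 / 2)
print(f"{annual_factor:.2f}x per year")  # ≈ 3.16x per year

# Compounded over two years, this recovers the original 10x figure.
print(f"{annual_factor ** 2:.1f}x over two years")
```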
LLM processing is typically partitioned across multiple accelerators to optimize performance, which drives key hardware and system parameter requirements.
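The partitioning mentioned above can be sketched as a simple pipeline-parallel split of a model's layers across accelerators. This is a minimal illustration under assumed even load balancing; production systems combine tensor, pipeline, and data parallelism, and the function below is hypothetical, not any framework's API:

```python
def partition_layers(num_layers: int, num_devices: int) -> list[range]:
    """Assign contiguous blocks of layers to each device, balancing block sizes."""
    base, extra = divmod(num_layers, num_devices)
    assignments, start = [], 0
    for d in range(num_devices):
        # The first `extra` devices each take one additional layer.
        size = base + (1 if d < extra else 0)
        assignments.append(range(start, start + size))
        start += size
    return assignments

# Example: a 96-layer model split over 8 accelerators gives 12 layers each.
print([len(r) for r in partition_layers(96, 8)])
```

Each accelerator then only needs enough memory and bandwidth for its own block of layers, which is what drives the per-device parameter requirements.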