Artificial intelligence (AI) is transforming industries, driving innovation and efficiency, but sustaining that growth is crucial to realizing its long-term benefits.
Sustainable AI growth faces several challenges, chief among them energy consumption: training and serving large language models (LLMs) demands enormous computational power.
Industry executives anticipate that the next generation of LLM training will require roughly 100 times more compute, while the energy needed to train a leading model grows tenfold every two years.
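To make that trend concrete, here is a minimal back-of-envelope sketch in Python. The 10 GWh starting figure and the time horizon are illustrative assumptions, not figures from the article; only the "10x every two years" growth rate comes from the text above.

```python
# Back-of-envelope projection of the trend quoted above:
# training energy for a leading model grows ~10x every two years.
def projected_energy(e0_gwh: float, years: float) -> float:
    """Energy after `years`, assuming a 10x increase every 2 years."""
    return e0_gwh * 10 ** (years / 2)

# Illustrative assumption: a leading model costs ~10 GWh to train today.
# The same trend then implies ~100 GWh in two years and ~1,000 GWh in four.
for years in (0, 2, 4):
    print(f"year +{years}: {projected_energy(10.0, years):,.0f} GWh")
```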
Partitioning the workload across many accelerators, for example by sharding model weights and data, has become a key lever for optimizing LLM performance and efficiency.
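As one way to picture this, here is a minimal sketch of tensor-parallel partitioning using JAX's sharding API. The mesh layout, axis name, and matrix sizes are illustrative assumptions, not a description of any specific production setup mentioned in the article.

```python
import jax
import jax.numpy as jnp
from jax.experimental import mesh_utils
from jax.sharding import Mesh, NamedSharding, PartitionSpec as P

# Build a 1-D mesh over all available accelerators.
devices = mesh_utils.create_device_mesh((jax.device_count(),))
mesh = Mesh(devices, axis_names=("model",))

# Shard a large weight matrix column-wise across the "model" axis
# (tensor parallelism), while replicating the activations.
w = jax.device_put(
    jnp.ones((4096, 4096)),
    NamedSharding(mesh, P(None, "model")),  # columns split across devices
)
x = jax.device_put(
    jnp.ones((8, 4096)),
    NamedSharding(mesh, P()),               # replicated on every device
)

@jax.jit
def layer(x, w):
    # XLA partitions this matmul so each device computes only its
    # column slice of the output; no single device holds the full weights.
    return x @ w

y = layer(x, w)
print(y.sharding)  # output stays partitioned along its last axis
```

The design point the sketch illustrates: because no single device must hold the full weight matrix, partitioning lets a model scale past one accelerator's memory while spreading the compute, which is why it drives both performance and energy efficiency.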
Author's summary: AI's energy consumption challenge requires innovative solutions.