Here’s why Artificial Intelligence is so power-hungry

Artificial intelligence is getting more expensive to develop, and its cost is growing faster than the energy efficiency of the models improves. One reason is how models are tuned: a network is trained many times with different structures, and the best one is selected. Another is the sheer volume of data. For example, a model called Bidirectional Encoder Representations from Transformers (BERT) was trained on 3.3 billion words from English books and Wikipedia articles, and training required going through that data set not once, but 40 times.
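
To make this concrete, here is a minimal sketch of why tuning multiplies training cost, written in PyTorch with toy placeholder data (the sizes and hyperparameters are my own illustrative choices, not BERT's): every candidate structure pays for a full multi-epoch training run, and only one is kept.

```python
import torch
import torch.nn as nn

inputs = torch.randn(256, 32)   # toy stand-in for a real training corpus
targets = torch.randn(256, 1)

def train(hidden_size, epochs=40):
    """Train one candidate structure; each epoch is one full pass over the data."""
    model = nn.Sequential(nn.Linear(32, hidden_size), nn.ReLU(),
                          nn.Linear(hidden_size, 1))
    opt = torch.optim.SGD(model.parameters(), lr=0.01)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):                    # BERT made ~40 such passes
        opt.zero_grad()
        loss = loss_fn(model(inputs), targets)
        loss.backward()
        opt.step()
    return loss.item()

# Every candidate structure is trained to completion; only the best survives,
# but the energy for all the discarded runs has already been spent.
losses = {h: train(h) for h in (16, 64, 256)}
best_hidden_size = min(losses, key=losses.get)
```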

Also, AI models are growing larger every year. GPT-2, a language model similar to BERT, has 1.5 billion weights in its network; its successor GPT-3 has 175 billion. In practice, the best-performing model tends to be the one that takes the most energy and money to develop.
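
A back-of-the-envelope calculation (my own arithmetic, not from the article) shows what that difference in weight count means just for storing the models at standard 32-bit precision:

```python
# Memory needed merely to hold the weights, at 4 bytes per 32-bit weight.
for name, n_weights in [("GPT-2", 1_500_000_000), ("GPT-3", 175_000_000_000)]:
    gigabytes = n_weights * 4 / 1e9
    print(f"{name}: {n_weights:,} weights -> ~{gigabytes:,.0f} GB")
# GPT-2: 1,500,000,000 weights -> ~6 GB
# GPT-3: 175,000,000,000 weights -> ~700 GB
```

And that is storage alone; training has to touch every one of those weights on every pass over the data.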

Consequently, the carbon footprint of AI development is growing, and unless we switch to 100% renewable energy sources, it may stand at odds with the goals of cutting greenhouse gas emissions and slowing down climate change.

However, there are approaches to creating more energy-efficient networks. One of them, called shapeshifter networks, reuses the same weights in multiple parts of the network. This makes the network smaller and thus more energy-efficient; a rough illustration follows below.
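
The sketch below shows the general idea in PyTorch. It is generic weight sharing, not the published shapeshifter-network implementation: the same linear layer object is applied at several depths, so the network behaves like a deep one while storing only one copy of those weights.

```python
import torch
import torch.nn as nn

class SharedDepthNet(nn.Module):
    def __init__(self, width=64, depth=4):
        super().__init__()
        self.shared = nn.Linear(width, width)  # one weight matrix...
        self.depth = depth                     # ...reused `depth` times
        self.head = nn.Linear(width, 1)

    def forward(self, x):
        for _ in range(self.depth):
            x = torch.relu(self.shared(x))     # same weights at every depth
        return self.head(x)

# Four distinct layers would store four weight matrices; sharing stores one,
# so the model is roughly a quarter the size at this depth.
model = SharedDepthNet()
n_params = sum(p.numel() for p in model.parameters())
```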

Generally speaking, the AI community should put more effort into developing energy-efficient training schemes. Otherwise, AI will become dominated by a select few who can afford to train neural networks.

Source: https://theconversation.com/it-takes-a-lot-of-energy-for-machines-to-learn-heres-why-ai-is-so-power-hungry-151825


