
Felicity Bradstock is a freelance writer specialising in Energy and Finance. She has a Master’s in International Development from the University of Birmingham, UK.


AI’s Energy Consumption: A Silent Threat To Environmental Sustainability?

  • AI's increasing integration in daily life and industries requires vast computing power and electricity, raising concerns about its environmental impact due to high energy consumption. 
  • AI training methods such as deep learning rely on "brute force" statistical techniques that, unlike human learning, require significant amounts of power. 
  • Companies such as Google and Microsoft currently don't disclose the energy consumption of their AI programs, leading to a call for transparency and government regulation to align with climate policies.

As energy companies look to modernize their operations to cut costs and improve efficiency, many are turning to technologies such as machine learning, robotics, and artificial intelligence (AI). But as with any new technology, it's important to consider the whole picture. While AI and related technologies could improve operations in both fossil fuels and green alternatives, just how much energy does it take to power AI? 

There has long been discussion of AI's impact on the planet, but amid the buzz around impressive new technologies, it has mostly faded into the background. In 2020, researchers at OpenAI in San Francisco presented a learning algorithm that could control a robotic hand to manipulate a Rubik's Cube, a huge step forward in AI technology. The feat required over 1,000 computers, plus a dozen machines running specialized graphics chips, crunching complex calculations for months. By one estimate, the process consumed approximately 2.8 gigawatt-hours of electricity, roughly the output of three nuclear power plants running for an hour, although the company did not confirm the figure. 
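
The comparison in that estimate is simple energy arithmetic, which can be checked directly. The sketch below assumes a typical large nuclear reactor outputs about 1 gigawatt; the article does not state the plant size behind its comparison, so that figure is an assumption.

```python
# Sanity check of the reported figure: energy = power x time,
# so 2.8 GWh corresponds to 2.8 "plant-hours" at 1 GW per plant.
TRAINING_ENERGY_GWH = 2.8   # reported estimate for the project
PLANT_OUTPUT_GW = 1.0       # assumed output of one large nuclear plant

plant_hours = TRAINING_ENERGY_GWH / PLANT_OUTPUT_GW
print(f"{plant_hours:.1f} plant-hours")  # ~2.8, i.e. roughly three plants for an hour
```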

AI technology is advancing at a staggering rate, following decades of heavy investment. But with this progress come concerns about its impact on the environment. While the output may appear simple, as machines learn to answer questions, recognize images, and play games, the power required to carry out these tasks is immense. Running AI requires huge amounts of computing power and electricity to train models and solve algorithms. Sasha Luccioni, a postdoctoral researcher at Mila, an AI research institute in Canada, explained: "The concern is that machine-learning algorithms, in general, are consuming more and more energy, using more data, training for longer and longer." 

Over the last few years, AI has gradually become more integrated into our everyday activities, whether answering questions via Alexa or Siri, getting directions from Google Maps, or identifying people in photos, all available on our phones and home computers. But few people consider how much power it takes to complete these seemingly simple tasks. We often compare machines to humans at such tasks, assuming computers can answer questions just as our brains can, with relatively little effort. However, AI does not learn information in a structured way; it does not understand human concepts such as cause and effect, context, or analogies, which means it must rely on the "brute force" statistical techniques of deep learning to work. 

A deep learning model is trained very differently from our brains. To identify an image of a cat, for example, it is shown thousands of photos of cats that have been labeled by humans. The model does not understand that a cat is more likely than a dog to climb a tree, or grasp other feline behavior; it will only associate such objects if they appear together in the images. To recognize an image reliably, the model must be shown a vast number of example combinations until the statistical patterns emerge. 
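
A toy sketch can make this label-driven process concrete. The example below is not any company's real pipeline: it trains a tiny logistic-regression classifier on made-up two-number "images" labeled cat or dog. The model learns no concept of a cat; it only fits a statistical boundary between labeled examples, and every weight update requires another full pass over the data, which is where training's energy cost comes from.

```python
import math
import random

random.seed(0)

# Hypothetical labeled examples: each "image" is just two numbers.
# Label 1 = "cat", label 0 = "dog". Real training uses millions of photos.
data = [((random.gauss(1.0, 0.3), random.gauss(1.0, 0.3)), 1) for _ in range(50)] + \
       [((random.gauss(-1.0, 0.3), random.gauss(-1.0, 0.3)), 0) for _ in range(50)]

w = [0.0, 0.0]  # model weights, adjusted by "brute force" repetition
b = 0.0
lr = 0.1        # learning rate

def predict(x):
    z = w[0] * x[0] + w[1] * x[1] + b
    return 1.0 / (1.0 + math.exp(-z))  # sigmoid: probability of "cat"

# Each epoch is a full pass over every labeled example; deep networks
# repeat this at vastly larger scale, hence the electricity demand.
for epoch in range(100):
    for x, y in data:
        err = predict(x) - y
        w[0] -= lr * err * x[0]
        w[1] -= lr * err * x[1]
        b -= lr * err

accuracy = sum((predict(x) > 0.5) == bool(y) for x, y in data) / len(data)
print(f"training accuracy: {accuracy:.2f}")
```

The model ends up separating the two clusters, but only because it was shown enough labeled points, exactly the "show it everything until it learns" pattern described above.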

Until now, the response to using AI in energy operations to improve efficiency and reduce costs has been overwhelmingly positive. But experts now worry that the high energy demands of these technologies have been widely overlooked. If AI is used in energy projects to support modernization and decarbonization, it may be causing more harm to the environment than we realize. While it could revolutionize trillion-dollar industries, from energy to retail, creating AI technologies such as chatbots and image generators will require a huge amount of electricity, which could add to the world's carbon emissions. 

At present, there is a great deal of ambiguity around how much energy it takes to power AI programs. Ask ChatGPT this question, for example, and it responds along the lines of: "As an AI language model, I don't have a physical presence or directly consume energy." The complexity of AI programs means they consume far more energy than other forms of computing, but the companies building them, such as Google and Microsoft, do not disclose how much. We know very little about how much electricity and water it takes to train and run their AI models, or what sources of energy power their data centers. 

We're seeing more and more companies incorporate AI into their operations as it becomes more widely available. Luccioni, now the climate lead for the AI company Hugging Face, explained: "This exponential use of AI brings with it the need for more and more energy." She added, "And yet we're seeing this shift of people using generative AI models just because they feel like they should, without sustainability being taken into account." 

The rapid advancement of AI technology has led companies across all sectors to incorporate AI programs into their operations as they look to modernize. This can be viewed as positive: embracing new technology could make operations more efficient. However, the energy use of AI and similar technologies is often not considered, meaning companies are rapidly adopting AI programs without assessing their environmental impact. Going forward, it is important that companies become more transparent about the energy consumption of AI, and that governments regulate the sector in line with their climate policies. 

By Felicity Bradstock for Oilprice.com
