As energy companies look to modernize their operations to cut costs and improve efficiency, many are turning to technologies such as machine learning, robotics, and artificial intelligence (AI). But as with any new technology, it’s important to consider the whole picture. While AI and related technologies could help improve operations, in both fossil fuels and green alternatives, just how much energy does it take to power AI?
There has long been a discussion about the impact of AI on the planet, but amid the buzz around impressive new technologies, it has mostly faded into the background. In 2019, researchers at OpenAI in San Francisco unveiled a learning algorithm that could use a robotic hand to manipulate a Rubik’s Cube, a huge step forward in AI technology. Achieving the feat required more than 1,000 computers and a dozen machines running specialized graphics chips, crunching complex calculations for months. By some estimates, the process consumed approximately 2.8 gigawatt-hours of electricity, roughly the output of three nuclear power plants running for an hour, although the company did not confirm the figure.
AI technology is advancing at a staggering rate, following decades of heavy investment. But with this progress come concerns about its impact on the environment. While the output may appear simple, as machines learn to answer questions, recognize images, and play games, the power required to carry out these tasks is immense: training and running AI models demands huge amounts of computing power and electricity. Sasha Luccioni, a postdoctoral researcher at Mila, an AI research institute in Canada, explained: “The concern is that machine-learning algorithms, in general, are consuming more and more energy, using more data, training for longer and longer.”
And over the last few years, AI has gradually become integrated into our everyday activities, whether answering questions via Alexa or Siri, plotting routes in Google Maps, or identifying people in photos, all available on our phones and home computers. But few people consider how much power it takes to complete these seemingly simple tasks. We often compare machines to humans at these tasks, assuming computers can answer questions just as our brains can, with relatively little effort. However, AI doesn’t learn information in a structured way; it doesn’t understand human concepts such as cause and effect, context, or analogies. Instead, it relies on the “brute force” statistical approach of deep learning.
A deep learning model is trained very differently from our brains. For example, to identify an image of a cat, it is shown thousands of photos of cats that have been labeled by humans. The model does not understand that a cat is more likely than a dog to climb a tree or perform other feline activities, and it will only associate those objects if they appear together in its training images. For the model to recognize a pattern reliably, it must be shown a vast number of labeled examples.
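To make the labeled-example idea concrete, here is a deliberately tiny sketch in Python. It is not a real vision model (those use millions of images and deep neural networks); instead, each “image” is a hypothetical three-number feature vector, and the “model” is a simple nearest-centroid classifier. The point it illustrates is the one above: the model learns only whatever statistical patterns the human-labeled examples contain.

```python
# Minimal illustration of supervised learning from labeled examples.
# Assumptions: each "image" is a made-up feature vector; the classifier
# averages each label's examples into a centroid, then predicts whichever
# centroid a new example sits closest to.

def train(examples):
    """examples: list of (features, label) pairs. Returns per-label centroids."""
    sums, counts = {}, {}
    for feats, label in examples:
        acc = sums.setdefault(label, [0.0] * len(feats))
        for i, value in enumerate(feats):
            acc[i] += value
        counts[label] = counts.get(label, 0) + 1
    return {lbl: [v / counts[lbl] for v in acc] for lbl, acc in sums.items()}

def predict(centroids, feats):
    """Return the label whose centroid is nearest (squared distance)."""
    def dist(label):
        return sum((a - b) ** 2 for a, b in zip(centroids[label], feats))
    return min(centroids, key=dist)

# Hypothetical human-labeled data (invented features: ear shape, snout, size).
labeled = [
    ([0.9, 0.2, 0.3], "cat"),
    ([0.8, 0.3, 0.2], "cat"),
    ([0.3, 0.9, 0.8], "dog"),
    ([0.2, 0.8, 0.9], "dog"),
]
model = train(labeled)
print(predict(model, [0.85, 0.25, 0.25]))  # near the "cat" examples
```

A real deep learning system replaces the hand-picked features and centroid averaging with millions of learned parameters, which is precisely why the training described in this article consumes so much computing power and electricity.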
Until now, the response to the use of AI to make energy operations more efficient and cut costs has been overwhelmingly positive. But experts now worry that the high energy demands of these technologies have been widely overlooked. If AI is used in energy projects to support modernization and decarbonization, it may be causing more harm to the environment than we realize. While it could revolutionize trillion-dollar industries, from energy to retail, building AI technologies such as chatbots and image generators will require a huge amount of electricity, which could add to the world’s carbon emissions.
At present, there is a great deal of ambiguity around how much energy it takes to power AI programs. Ask ChatGPT, for example, and it responds along the lines of “As an AI language model, I don’t have a physical presence or directly consume energy.” The complexity of AI programs means they consume far more energy than other forms of computing, but the companies building them, such as Google and Microsoft, are not disclosing how much. We still know very little about how much electricity and water it takes to train and power their AI models, or what sources of energy run their data centers.
We’re seeing more and more companies incorporate AI into their operations as it becomes more widely available. Luccioni, who also serves as climate lead at the AI company Hugging Face, explained: “This exponential use of AI brings with it the need for more and more energy.” She added, “And yet we’re seeing this shift of people using generative AI models just because they feel like they should, without sustainability being taken into account.”
The rapid advancement of AI has led companies across all sectors to incorporate AI programs into their operations as they look to evolve. This can be viewed as positive: companies are embracing technology and modernization, which could make operations more efficient. However, the energy use of AI and similar technologies is often not considered, meaning companies are rapidly adopting AI programs without assessing their environmental impact. Going forward, it is important that information about AI’s energy use becomes more transparent and that governments regulate the sector in line with their climate policies.
By Felicity Bradstock for Oilprice.com