A new breakthrough in tech may have just solved artificial intelligence’s energy problems. AI requires a massive and growing amount of energy, and is producing more and more greenhouse gas emissions as the sector continues to expand. The issue has been the subject of increasing attention and anxiety in recent months, but those worries could soon be a thing of the past thanks to a new kind of analogue computer chip developed by IBM Research.
The analogue chip is capable of running an AI speech recognition model 14 times more efficiently than a standard computer chip. The analogue chip is a compute-in-memory (CiM) design, which means that it performs calculations directly within its own memory instead of sending information back and forth millions of times to recall or store data in external memory chips, thereby relieving a significant bottleneck currently plaguing AI operations.
“IBM’s device contains 35 million so-called phase-change memory cells – a form of CiM – that can be set to one of two states, like transistors in computer chips, but also to varying degrees between them,” New Scientist reported this week. This is a huge breakthrough for computing, as “these varied states can be used to represent the synaptic weights between artificial neurons in a neural network, a type of AI that models the way that links between neurons in human brains vary in strength when learning new information or skills, something that is traditionally stored as a digital value in computer memory.” Thanks to this innovation, the analogue chip is able to store and process weights with just a fraction of the computing effort typically required.
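The idea described above can be illustrated with a short sketch. This is not IBM's actual design; it is a minimal, invented NumPy example of the general principle behind analogue compute-in-memory: signed synaptic weights are mapped onto a continuous range of cell conductances, and a layer's matrix-vector product then falls out of circuit physics (currents summing along wires) rather than repeated digital memory fetches. All numbers and the single-cell mapping are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# A tiny layer: 3 artificial neurons, 4 inputs each.
weights = rng.uniform(-1.0, 1.0, size=(3, 4))

# Each weight is stored as a conductance somewhere between the cell's
# low and high states (arbitrary units). Real devices often use a
# differential pair of cells for signed weights; a single-cell affine
# map is used here for brevity.
g_min, g_max = 0.1, 1.0
conductances = g_min + (weights + 1.0) / 2.0 * (g_max - g_min)

# Applying input voltages to the array: each output current is a sum of
# V * G terms, so the dot product is "computed" by the memory itself.
voltages = np.array([0.5, 0.2, -0.3, 0.8])
currents = conductances @ voltages

# Inverting the affine map recovers the ordinary weight-space result,
# showing the analogue readout carries the same information.
recovered = (2.0 * (conductances - g_min) / (g_max - g_min) - 1.0) @ voltages
assert np.allclose(recovered, weights @ voltages)
```

The point of the sketch is the last line: because the physical read operation already contains the multiply-accumulate, no digital value for each weight ever has to shuttle between a separate memory and a processor.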
The efficiency of these new analogue chips could not only solve AI’s energy use issue, but also its chip-use issue. Training AI programs can require an enormous number of computer chips, with thousands sometimes used on a single project. This, too, has become an issue due to a worldwide computer chip shortage coupled with a boom in new AI ventures. In particular, AI companies are having unprecedented difficulty securing a type of chip known as a graphics processing unit, or GPU, as it has (until now) been the most efficient form of chip for AI’s processing needs. The shortage has left startups and smaller companies “scrambling” and taking “desperate measures” to secure the essential chips, a recent New York Times article reported.
This kind of energy-saving innovation for AI can’t come fast enough. The scale of the present-day energy needs for machine learning is enormous, and growing at a breakneck pace. The sector’s energy use grew 100-fold between 2012 and 2021, and has dramatically spiked since ChatGPT hit the market and spurred an AI gold rush. And most of that energy is derived from fossil fuels. Already, the overall carbon footprint of artificial intelligence is almost as large as that of Bitcoin – meaning it’s equivalent to the carbon footprint of some developed nations. “Currently, the entire IT industry is responsible for around 2 percent of global CO2 emissions,” Science Alert recently reported. But AI is on track to blow those numbers out of the water. Consulting firm Gartner projects that in a business-as-usual scenario, the AI sector alone will consume 3.5 percent of global electricity by 2030.
It has been estimated that the training process for GPT-3, which later evolved into ChatGPT, required around 1,287 megawatt hours of electricity and a whopping 10,000 computer chips. To put this in perspective, that amount of energy could power about 121 homes in the United States for an entire year – and produce around 550 tonnes of carbon dioxide in the process. At present, experts calculate that OpenAI, the creator of ChatGPT, is likely spending approximately US$700,000 per day just on computing costs in order to provide the chatbot’s services to 100 million users worldwide.
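The homes comparison above checks out with simple arithmetic. The figure of roughly 10,600 kWh for an average US household's annual electricity use is an assumption drawn from typical EIA-style estimates, not from the article itself:

```python
# Back-of-envelope check of the GPT-3 training figures quoted above.
training_energy_mwh = 1_287
training_energy_kwh = training_energy_mwh * 1_000      # 1,287,000 kWh

# Assumed average annual US household electricity use (EIA-style estimate).
avg_us_home_kwh_per_year = 10_600

homes_powered_for_a_year = training_energy_kwh / avg_us_home_kwh_per_year
print(round(homes_powered_for_a_year))  # roughly 121 homes
```

Dividing the training energy by a single household's annual use lands almost exactly on the 121-home figure cited in the article.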
By Haley Zaremba for Oilprice.com