Green AI aims to make explicit the connection between computing power and carbon emissions, so that the carbon costs of AI’s massive computational demands can be managed. According to a recent account in Sifted, three things can be controlled to reduce AI’s carbon footprint: the design of the AI algorithms, the hardware on which they are trained, and the availability of green electricity to the data centers in which they run.
Large companies such as Google have the resources to control all three, but the average company adopting AI does not. “It is therefore imperative that those developing the AI make sure they consider sustainability throughout the design and training process,” stated the author of the account, Brian Mullins, CEO of the AI startup Mind Foundry, a spin-out from Oxford University.
“The challenge for AI startups, and those looking to procure their services, is to strike a balance between providing business with the benefits of this tech and its long term impact on the environment,” Mullins stated.
Sizing the issue, the European Union warns that greenhouse gas emissions attributed to the information and communications technology (ICT) industry today account for two percent of all emissions, as much as all air traffic, and if left unchecked would rise to 14 percent of global emissions within 20 years.
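For a sense of the growth that projection implies, assuming a smooth compound trajectory over the 20-year horizon (an assumption not stated in the EU figure), a quick back-of-the-envelope calculation in Python:

# Illustrative arithmetic only: the EU projection gives start and end shares,
# not a growth path. If ICT's share of global emissions rose smoothly from
# 2% to 14% over 20 years, the implied compound growth of that share would be:
start_share, end_share, years = 0.02, 0.14, 20
annual_growth = (end_share / start_share) ** (1 / years) - 1
print(f"implied compound growth of ICT emissions share: {annual_growth:.1%} per year")

That works out to roughly 10 percent per year.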
AI itself can be put to good use in green projects. The UK startup Space Intelligence, for example, is applying machine learning and AI to satellite data to address environmental concerns, such as reforestation, so that industry can take corrective steps.
“We are at the point where business leaders must take a hands-on approach in making sure the AI they adopt has taken these considerations into account—and those developing these technologies must be able to deliver on those concerns,” stated Mullins. “An understanding of the costs of building and deploying AI models both financially and in terms of environmental impact must be developed across the board.”
Google, meanwhile, has made a commitment to be carbon-free by 2030. At the recent Google I/O 2021 event, the company elaborated on this effort, announcing a plan to use geothermal energy to power its data centers in Nevada. “We are the first corporate to sign an agreement to develop next-generation geothermal,” stated Kate Brandt, Google’s sustainability officer, in an interview with CNET ahead of the event.
The agreement is with Fervo Energy, a clean energy startup aiming to “unlock” the potential of geothermal resources. Starting next year, Fervo will work with Google to begin adding geothermal energy to Nevada’s grid, complementing the existing renewable energy sources in the area and helping to push Google toward its 24/7 carbon-free energy goal.
Geothermal power plants use heat from the Earth as an energy source. Deep below the planet’s surface, decaying radioactive material and magma heat water, providing an almost limitless supply of energy, if it can be accessed.
“It is a phenomenally great source of energy,” stated Graham Heinson, a geophysicist at the University of Adelaide. “And there is a huge amount of trapped heat in the Earth that could power enormous parts of the world.”
Geothermal would be part of Google’s energy mix, along with wind and solar; unlike those sources, geothermal is always available. Fervo, which in 2018 received funding from the Bill Gates-backed Breakthrough Energy fund, is working on an Enhanced Geothermal System (EGS), which would send water into the earth and bring heated water back up, an improvement on current methods. The companies expect their collaboration to advance the technique.
“We’re combining our novel AI with some drilling techniques and fiber-optic sensing,” stated Brandt. She noted the challenge facing Google: many locations still operate on “dirty energy,” with fossil fuels powering the grid. That is true in Sydney, Australia, where Brandt is based and where coal dominates the energy mix.
“Unless there are changes in policy or procurement in my region, the products I use to access Google’s services won’t be running on clean energy,” she stated. “This is going to be hard.”
One team of researchers has developed an emissions calculator to estimate the energy use and environmental impact of training machine learning models. AI researcher Alexandra Luccioni of the University of Montreal and Mila collaborated on that study. “This is definitely something that people are working on, be it via more efficient GPUs or by buying renewable energy credits for the carbon that was produced by neural network training,” she stated in a recent account in Nature Machine Intelligence. “Using renewable energy grids for training neural networks is the single biggest change that can be made. It can make emissions vary by a factor of 40, between a fully renewable grid and a fully coal grid,” she stated.
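The arithmetic behind such a calculator is straightforward: estimate the energy a training run draws, then multiply by the carbon intensity of the grid that powers it. The sketch below illustrates the idea in Python; the function name, the hardware figures, the PUE, and the grid intensities (roughly 0.02 kg CO2e per kWh for a largely hydro grid versus about 0.8 for a coal-heavy one) are illustrative assumptions, not the published calculator’s actual interface or coefficients.

def estimate_training_emissions_kg(gpu_power_watts, num_gpus, hours, pue,
                                   grid_intensity_kg_per_kwh):
    """Rough CO2-equivalent estimate (kg) for one training run.

    energy (kWh) = GPU power draw x GPU count x hours x data-center PUE
    emissions (kg CO2e) = energy x carbon intensity of the local grid
    """
    energy_kwh = (gpu_power_watts / 1000.0) * num_gpus * hours * pue
    return energy_kwh * grid_intensity_kg_per_kwh

# Hypothetical run: 8 GPUs at 300 W each for 72 hours, PUE of 1.1.
run = dict(gpu_power_watts=300, num_gpus=8, hours=72, pue=1.1)
hydro = estimate_training_emissions_kg(**run, grid_intensity_kg_per_kwh=0.02)
coal = estimate_training_emissions_kg(**run, grid_intensity_kg_per_kwh=0.8)
print(f"hydro-heavy grid: {hydro:.1f} kg CO2e; coal-heavy grid: {coal:.1f} kg CO2e "
      f"(~{coal / hydro:.0f}x difference)")

With those assumed intensities, the same run comes out near 4 kg versus roughly 150 kg of CO2e, the kind of factor-of-40 gap Luccioni describes.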
Progress will require researchers to divulge how much carbon dioxide their research produced, as well as to reuse models instead of training from scratch and to use more efficient GPUs, she suggested.
Luccioni has ideas for how to push the green AI effort forward. “More tax incentives should be given for cloud providers to open data centers in places with hydro or solar energy,” she stated. “For example, in Quebec, we have a very low-carbon grid that relies mostly on hydro, plus with the cold winters, the heat generated by computing centers can be used to heat homes. If companies had a big incentive to build their data centers there and not in, say, Texas, where the grid is mostly coal-fueled, it could make a massive impact.”
One scientist suggested that less AI might be better. Deepika Sandeep, an AI scientist who heads the AI and ML program at Bharat Light & Power (BLP), a clean energy generation company based in Bangalore, India, believes deep learning should be applied judiciously. “Not every problem demands a machine learning-based solution,” she stated.
Many startups are targeting more energy-efficient AI. A list of 42 such companies was recently published at AI Startups.