by: Eko Prasetyo
The rise of generative artificial intelligence (AI), the technology now powering popular chatbots and image generators, raises questions about its impact on the Earth’s environment.
Kate Saenko, Associate Professor of Computer Science at Boston University, has expressed concern about the energy cost of building such AI models: the smarter the AI, the more energy it consumes. What does this mean for the carbon footprint of generative AI in the future?
AI capabilities and resources required
What sets generative AI apart is its ability to create complex data. Unlike discriminative AI, which chooses among a set of available options, such as accepting or rejecting a credit application, generative AI can produce more elaborate outputs: sentences, paragraphs, images, and even short videos. The technology has already been used in smart speakers to provide voice responses and in autocomplete features to suggest searches.
The popularity of generative AI is now surging, however, because the technology can produce human-like language and even create realistic photos.
Higher energy consumption is one of the main concerns. There is currently no exact figure for the energy consumption of a single AI model, but the cost includes the energy used to manufacture the computing hardware, to train the model, and to run it in production.
In 2019, researchers found that training a generative AI model called BERT, which has 110 million parameters, consumed roughly as much energy as one person taking a round-trip transcontinental flight. The number of parameters determines the size of an AI model: generally, the larger the number, the smarter the AI.
The researchers estimated that training GPT-3, the model that ChatGPT was built on, with its 175 billion parameters, consumed 1,287 megawatt-hours of electricity and produced 552 tonnes of CO2. This is equivalent to the emissions of 123 petrol cars driven for one year.
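As a rough cross-check of that comparison (the per-car figure here is an assumption based on commonly cited US EPA estimates, not a number from the study): a typical petrol passenger car emits roughly 4.6 tonnes of CO2 per year, and 552 tonnes divided by 123 cars works out to about 4.5 tonnes per car per year, so the two figures are consistent.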
However, model size is not the only factor that affects an AI’s carbon footprint. BLOOM, for example, an open-access model similar in size to GPT-3, has a much smaller footprint: its training used only 433 megawatt-hours of electricity and produced the equivalent of 30 tonnes of CO2.
Studies by Google show that an AI model of similar size, built with a more efficient architecture, run on more efficient processors, and hosted in environmentally friendly data centres, can have a carbon footprint 100 to 1,000 times smaller.
Potential future uses of AI
Looking to the future, as larger AI models are developed, their energy consumption will naturally rise. Data on the carbon footprint of a single generative AI query is still limited, but some estimates suggest it could be four to five times higher than that of a single search engine query.
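To put that in perspective (the figures here are illustrative assumptions, not measurements): one widely cited estimate puts a conventional web search at around 0.3 watt-hours of electricity, so at four to five times that, a single generative AI query would use roughly 1.2 to 1.5 watt-hours. The difference per query is tiny, but multiplied across billions of queries a day it becomes significant.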
The popularity of AI chatbots and image generators is growing rapidly. A few years ago, few people outside research labs were using models like BERT and GPT. That changed on 30 November 2022, when OpenAI launched ChatGPT.
The latest data show that by March 2023, ChatGPT had been visited about 1.5 billion times. Two months later, Microsoft also integrated ChatGPT into its search engine, Bing, so that anyone can use it.
If chatbots become as popular as search engines, the energy consumed by AI will keep rising. AI assistance is not limited to search: it can also be used to write documents, solve maths problems, and even create marketing campaigns.
AI models must also be updated constantly. ChatGPT, for example, was originally trained only on data up to 2021, so it does not know what happened after that. Continually updating its knowledge will further increase the energy required.
There are also people who would rather ask a chatbot for information than use a search engine. Instead of scanning a page full of links, they get an instant answer, much as they would from a human, assuming the answer is accurate, of course. For many users, the speed of obtaining information may outweigh AI’s higher energy use compared with search engines.
It is difficult to predict the future with certainty, but large generative AI models are likely here to stay, and people will increasingly turn to them for information.
For example, a student who needs help with a maths problem today might ask a teacher or a friend directly, or consult a textbook. In the future, they will likely ask a chatbot instead. The same may apply to expert knowledge on legal or health issues.
Solution planning
A single large AI model may not directly harm the environment. But if thousands of companies develop thousands of AI bots for different purposes, and each of those models is used by millions of people, the combined energy consumption could become a problem. More research is therefore needed to make generative AI operate more efficiently.
Utilising renewable energy can also cut the carbon emissions of AI operations to roughly a third or a quarter of what they would be using fossil fuels. One approach is to locate computing systems where renewable energy is abundant, or to schedule AI workloads for times of day when renewable supply is higher.
Ultimately, social pressure may push companies and research labs to disclose the carbon footprints of their AI models, and some companies are already doing so. In the future, consumers may be able to use this information to choose a more environmentally friendly chatbot.
*The author is a researcher in law and technology
Banner photo: Markus Spiske/pexels.com