ChatGPT’s Energy Consumption Raises Environmental Concerns

The chatbot consumes over half a million kilowatt-hours a day, roughly 17,000 times the daily electricity usage of an average US household.

New York: The exponential growth of Artificial Intelligence (AI) comes with a significant environmental cost, and concerns are mounting over its electricity consumption. OpenAI’s widely used generative AI model, ChatGPT, has come under scrutiny for its staggering energy usage. According to The New Yorker, ChatGPT alone devours over half a million kilowatt-hours daily to process approximately 200 million user requests, more than 17,000 times the daily energy consumption of an average US household.
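
As a rough back-of-envelope check, the article’s own figures imply the following per-request and per-household numbers (a sketch using only the numbers reported above, not measured data):

```python
# Back-of-envelope check using the article's reported figures.
daily_energy_kwh = 500_000       # reported daily consumption (kWh)
daily_requests = 200_000_000     # reported daily user requests
household_multiple = 17_000      # reported multiple of a US household's daily use

# Energy per request: 500,000 kWh / 200 million requests ≈ 0.0025 kWh ≈ 2.5 Wh.
energy_per_request_wh = daily_energy_kwh / daily_requests * 1_000
print(f"~{energy_per_request_wh:.1f} Wh per request")

# Implied household usage: 500,000 kWh / 17,000 ≈ 29 kWh per day,
# roughly 10,700 kWh per year, consistent with typical US averages.
implied_household_kwh = daily_energy_kwh / household_multiple
print(f"~{implied_household_kwh:.0f} kWh per household per day")
```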

As generative AI becomes more ubiquitous, the strain on energy resources is expected to escalate. Data scientist Alex de Vries highlights the potential consumption if tech giants like Google were to integrate generative AI into their services: his calculations suggest such integration could require approximately 29 billion kilowatt-hours per year, exceeding the annual electricity consumption of countries such as Kenya, Guatemala, and Croatia.
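
A figure of this size can be reproduced with a simple estimate. The query volume and per-query energy below are illustrative assumptions (they do not appear in the article), chosen only to show the shape of the calculation:

```python
# Illustrative reconstruction of a de Vries-style estimate for AI-assisted search.
# Both inputs are assumptions for the sketch, not figures from the article.
searches_per_day = 9_000_000_000   # assumed daily search volume
wh_per_ai_query = 8.9              # assumed energy per AI-assisted query (Wh)

annual_twh = searches_per_day * wh_per_ai_query * 365 / 1e12
print(f"~{annual_twh:.0f} TWh per year")   # ≈ 29 TWh, i.e. 29 billion kWh
```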

De Vries emphasizes the energy-intensive nature of AI infrastructure, noting that individual AI servers already rival the power consumption of multiple households combined. However, accurately gauging the overall electricity usage of the AI industry remains challenging due to operational variability and a lack of transparency from major tech corporations driving the AI revolution.

Drawing on data from Nvidia, a leading AI chip manufacturer, de Vries predicts a substantial increase in AI-related electricity consumption by 2027. His projections put annual consumption at 85 to 134 terawatt-hours, potentially amounting to half a percent of global electricity use by 2027.
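
The half-a-percent framing follows from dividing the projected range by total global electricity consumption, which is on the order of 25,000 to 30,000 terawatt-hours per year (an approximate figure assumed here, not stated in the article):

```python
# Rough share of global electricity implied by the 2027 projection.
# Global consumption is an order-of-magnitude assumption, not an article figure.
projected_ai_twh = (85, 134)        # de Vries's projected range for 2027 (TWh/yr)
global_electricity_twh = 27_000     # assumed global electricity use (TWh/yr)

low, high = (x / global_electricity_twh * 100 for x in projected_ai_twh)
print(f"~{low:.1f}% to ~{high:.1f}% of global electricity")   # ≈ 0.3% to 0.5%
```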

In comparison, even some of the world’s most energy-intensive businesses fall short of the projected AI consumption. Samsung, for instance, consumes close to 23 terawatt-hours annually, while Google and Microsoft use approximately 12 and 10 terawatt-hours, respectively, to power their data centers, networks, and user devices.
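
Put side by side, the projected 2027 range would dwarf any single one of these corporate footprints; the comparison below uses only the figures quoted above:

```python
# Compare the projected 2027 AI range against the corporate figures above (TWh/yr).
companies = {"Samsung": 23, "Google": 12, "Microsoft": 10}
ai_low, ai_high = 85, 134

for name, twh in companies.items():
    print(f"Projected AI usage is ~{ai_low / twh:.0f}x to ~{ai_high / twh:.0f}x {name}'s")
```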

As AI continues its pervasive expansion across diverse industries, mitigating its environmental impact becomes paramount. Developing more energy-efficient AI algorithms and hardware is imperative to strike a balance between technological advancement and sustainability goals.
