
OpenAI, along with major players such as Google, has generated plenty of excitement about advances in AI, but there are also concerns about the substantial resources, such as water and electricity, that these systems need in order to run. Among these technologies, ChatGPT stands out as one of the most widely used chatbots, prompting questions about how much OpenAI must invest in hardware and other resources to keep it fast and reliable.
On that front, Sam Altman recently shared details about ChatGPT's resource consumption, revealing how much energy and water the AI chatbot requires to process user queries.
ChatGPT Requires Water and Power: But To What Extent?
Altman laid out his views in a recent blog post, offering insight into how AI will develop and, ideally, become more resource-efficient in the coming years. According to Altman, a single ChatGPT query consumes approximately 0.34 watt-hours of energy, roughly what it takes to power a lightbulb for a few minutes.
That may not sound like much for a single query, but multiplied across the billions of queries AI chatbots handle every day, the electricity consumption becomes substantial.
He makes a similar point about the water used to generate AI responses. Altman says ChatGPT uses about a teaspoon of water per query; that may seem trivial on its own, but across billions of queries it adds up to an enormous volume. That said, keep in mind that these figures haven't been independently verified, so they should be treated with caution.
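To get a sense of scale, here is a rough back-of-the-envelope sketch using the unverified per-query figures above. The daily query volume is a hypothetical assumption for illustration only; the article does not state ChatGPT's actual traffic.

```python
# Back-of-the-envelope scaling of the (unverified) per-query figures.
# QUERIES_PER_DAY is an assumed number, not an official OpenAI statistic.

ENERGY_PER_QUERY_WH = 0.34        # watt-hours per query, per Altman's blog post
WATER_PER_QUERY_ML = 5.0          # ~1 US teaspoon, as cited in the article
QUERIES_PER_DAY = 1_000_000_000   # hypothetical daily load for illustration

daily_energy_mwh = ENERGY_PER_QUERY_WH * QUERIES_PER_DAY / 1_000_000  # Wh -> MWh
daily_water_m3 = WATER_PER_QUERY_ML * QUERIES_PER_DAY / 1_000_000     # mL -> cubic meters

print(f"Energy: {daily_energy_mwh:,.0f} MWh per day")        # ~340 MWh/day
print(f"Water:  {daily_water_m3:,.0f} cubic meters per day")  # ~5,000 m3/day
```

Under those assumptions, a billion daily queries would work out to roughly 340 megawatt-hours of electricity and about 5,000 cubic meters of water per day, which is why figures that look tiny per query still draw scrutiny at scale.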
In recent years, analysts and experts have raised concerns about the environmental impact of the rapid growth of AI systems. Companies such as OpenAI carry multibillion-dollar valuations, yet there is little transparency about how they procure the natural resources, particularly water, needed to sustain their operations.
In his defense, Altman argues that AI's rollout will unfold gradually, much like previous technological transformations. However, that transition could take a long time, raising concerns that access to these resources for everyday use could shrink considerably in the meantime.