Google’s carbon emissions have soared by 51% since 2019 as artificial intelligence hampers the tech company’s efforts to go green.
While the corporation has invested in renewable energy and carbon removal technology, it has failed to curb its scope 3 emissions (those further down the supply chain), which are largely driven by the growth in datacentre capacity required to power artificial intelligence.
The company reported a 27% increase in year-on-year electricity consumption as it struggles to decarbonise as quickly as its energy needs increase.
Datacentres play a crucial role in training and operating the models that underpin AI products such as Google’s Gemini and OpenAI’s GPT-4, which powers the ChatGPT chatbot. The International Energy Agency estimates that datacentres’ total electricity consumption could double from 2022 levels to 1,000TWh (terawatt hours) in 2026, approximately Japan’s level of electricity demand. AI will result in datacentres using 4.5% of global energy generation by 2030, according to calculations by the research firm SemiAnalysis.
“Don’t be evil.” But it’s critically necessary for functionality.
Got a question? Google it and boom, there’s an AI summary for you. Now you’re engaged in scrolling past the dubious response and the sponsored links before you can get to the results you want.
It’s called ‘enhancing the user experience’. It was tedious to ignore the paid ads where you were likely to be misled for profit, but now it’s enhanced tedium where you’re likely to be misled for no fucking reason.
If Google had built nuclear power plants 10 years ago, there would be zero emissions. If California had done it instead, there would also be zero emissions. If the federal government had built nuclear power plants, we’d also be at zero emissions. If all anti-nuclear people had killed themselves in 1979, we’d be net negative with emissions.
Practically unlimited demand is fine if the source doesn’t use fossil fuels to begin with, so I don’t see how this is an “AI” problem. It is, of course, a capitalism problem though.
They’re just heating up the oven to bake people alive.
ah, but on the flip side, ai can conjure up an email summary within seconds that can shave off up to 5 whole minutes from someone’s extremely busy day.
surely that’s adequate recompense for all that energy spent?
conjure up an email summary within seconds that can shave off up to 5 whole minutes
… but can it? Like actually, can one do that?
Sure, an LLM can generate something akin to a summary. It will look like it’s getting some of the points in a different form… but did it get the actual gist of it? Did it skip anything meaningful that, once ignored, will have far-reaching consequences?
So yes, sure, an LLM can generate shorter text related to what was said during the meeting. But if there is limited confidence in its accuracy and no responsibility behind it (unlike somebody who takes notes and summarises, and could face negative consequences for getting it wrong), then wouldn’t reliance on such a tool create more risk?
We’re cooked
I’ve been assured that AGI is right around the corner and will solve climate change (in a way that is economically palatable to the rich and powerful)
Starting to look like the biggest f*cking corner humanity ever faced.