Posted: 12 minute read

Google SGE & AI: Rhetoric Meets Reality

What you’re about to read is a story that you’re unlikely to see in the mainstream marketing press. On the face of things, Google’s new AI-powered Search Generative Experience seemingly represents a brave new world of search. But there are quite a few things its advocates overlook…

What is Google Search Generative Experience?

Face it, if you’re in any way tangentially involved in digital marketing, you’ve undoubtedly seen the animated screenshots and sleek YouTube videos hailing the arrival of Google Search Generative Experience (or simply Google SGE).

Touted as a ‘new way to experience the web’, SGE was hurriedly ushered into the world following the tidal wave of hype that surrounded the launch of OpenAI’s ChatGPT in November 2022.

Having hyped its own AI efforts for several years, Google was clearly spooked by the immediate ‘success’ of ChatGPT (if success is measured in numbers of account activations).

In what appears to be a case study in ‘drinking your own Kool Aid’, Google sought to wrest back control of the AI narrative by releasing its own AI-powered search experience.

However, the rollout of SGE has not been immediate (for reasons that we’ll soon delve into). In fact, to date, SGE remains unavailable in several major search markets (e.g. here in the UK).

Anyway, enough of the preamble… 

What exactly is Google SGE beyond the rather vague advertising slogan of ‘AI-powered search’? 

Typically, in a piece of reporting, you’d go to the ‘primary source’ for a quote - in this case Google. 

But, Google’s staggeringly obtuse press release about SGE does little beyond repeating the ‘AI-powered search’ mantra ad nauseam. The closest we get to an elucidation of SGE is that it will ‘provide you with more information and context to your searches’. 

It would be perhaps a little unkind of me to suggest that this is a ‘nothing burger’, but that’s what it appears to be. 

In other words, Google appears to have bolted on Bard - its pale imitation of ChatGPT - to its hugely successful search business. A.K.A. Google has just integrated ‘Clippy’ into the SERPs. 

If we’re a little less harsh, and simply boil things down to their essentials, Google SGE appears to be little more than the addition of a textual summary at the top of a search result.

Is this handy? Admittedly, for certain search results, yes.

However, Google was already doing this for many queries (hi, featured snippets!), and seems to have just added an extra degree of needless complexity (for Google, not for us, the humble users). You see, featured snippets at least appear to have been vetted by content evaluators, whereas AI-generated summaries appear to be ‘loose cannons’, playing fast and loose with the truth.

However, despite digressing at length so far, the real point of this article isn’t to discuss the objective merits of Google Search Generative Experience.

It may be the best thing in the world, but, as I’m about to contend, it may not be as big of a deal as the digital marketing commentariat is making out right now.

Here’s why…

Rolling out Google Search Generative Experience

Digital marketing (and the internet in general) has a dirty little secret. Energy. 

Or, to be more precise, its gargantuan thirst for energy.

Even before the advent of artificial intelligence, Google had a voracious appetite for electricity. 

It uses country-sized quantities of energy each year. 

Yes, we’re facing unprecedented impacts as a result of climate change. Countries are having to go to massive lengths to curtail their energy consumption. 

Yet, that online platform you use to search for cat memes and do the shopping is using immense amounts of electricity. As energy-industry commentator Robert Bryce has written, if Google were a country, its electricity use would rank among the top 90 countries in the world. What’s more, the company’s electricity use is doubling approximately every three years.

And that’s for Google as is. 

However, we appreciate we’re talking in generalities here. How about some context? 

According to Statista, Alphabet (Google’s parent company) used 15.4 TWh (terawatt-hours) of energy in 2020. That is the same as, or more than, the annual electricity consumption of each of the following countries:

  • Sri Lanka. 
  • Angola.
  • Slovenia. 
  • Uruguay. 
  • Lithuania. 
  • Costa Rica. 
  • Estonia. 
  • Albania. 
  • Luxembourg.
  • Jamaica. 
  • Malta.

The list goes on…

But, the thing is, add in AI-powered search to the mix and things get considerably more startling.

The energy consumption of artificial intelligence

Okay, now we get to the crux of my article. In fact, if there’s only one part of this article you’re going to remember (or screenshot and laugh about on Twitter or LinkedIn, though other social networks are available), it’s this part.

A clever chap by the name of Alex de Vries has crunched the numbers on integrating AI into Google search.

In his paper, The growing energy footprint of artificial intelligence, published in Joule, he found that both the training and inference stages of the LLMs (Large Language Models) that underpin AI chatbots such as ChatGPT use vast amounts of energy.

How much? 

It’s difficult to gain a 100% accurate picture because the tech companies behind many of the latest AI innovations are being decidedly coy (aka, not transparent in the slightest) about their energy usage. 

Some clever bods have done some digging though, and have come up with some realistic looking figures. 

Let’s first consider the ‘training stage’ of an LLM. GPT-3, the model that originally underpinned ChatGPT, reportedly consumed 1,287 MWh of electricity during its training stage.


Unit of Measure Equivalents for Electricity1

  • Kilowatt (kW) = 1,000 (one thousand) watts
  • Megawatt (MW) = 1,000,000 (one million) watts
  • Gigawatt (GW) = 1,000,000,000 (one billion) watts
  • Terawatt (TW) = 1,000,000,000,000 (one trillion) watts
  • Kilowatt-hour (kWh) = 1,000 (one thousand) watt-hours
  • Megawatt-hour (MWh) = 1,000,000 (one million) watt-hours
  • Gigawatt-hour (GWh) = 1,000,000,000 (one billion) watt-hours
  • Terawatt-hour (TWh) = 1,000,000,000,000 (one trillion) watt-hours


Note the distinction between watts and watt-hours in the units above. In simple terms, watts measure the amount of power a device requires to operate, while watt-hours measure the energy that device consumes over time (one watt-hour is one watt drawn for one hour).

That’s enough to provide 390 UK homes with electricity for an entire year (based on the UK average household using 3,300 kWh of electricity each year). Or, to look at it another way, given that a single MWh can power around 2,000 UK homes for an hour, that’s enough electricity to power roughly 2.6 million UK homes for an hour2.
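For the numerically minded, that comparison can be checked in a few lines of Python, using only the figures already quoted above (GPT-3’s reported 1,287 MWh of training energy, and the average UK household’s 3,300 kWh per year):

```python
# Sanity-check of the training-energy comparison, using the article's figures.
TRAINING_MWH = 1_287            # reported energy to train GPT-3
UK_HOME_KWH_PER_YEAR = 3_300    # average UK household (Ofgem figure)

training_kwh = TRAINING_MWH * 1_000   # 1 MWh = 1,000 kWh
homes_powered_for_a_year = training_kwh / UK_HOME_KWH_PER_YEAR
print(round(homes_powered_for_a_year))   # 390
```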

And, that’s just the training phase. 

Next, let’s consider the inference phase. In LLMs such as ChatGPT this inference phase refers to the LLM providing live responses to user queries. 

Although less attention has been paid to the energy usage of AIs during inference, it appears to be even greater than the training stage. 

Research firm SemiAnalysis has suggested that running ChatGPT requires 3,617 of NVIDIA’s HGX A100 servers, containing a total of 28,936 GPUs. The energy demand of these? A truly colossal 564 MWh per day…

Again, let’s provide some real-world context.

That’s enough to provide electricity to 170 UK homes for a year (again, based on Ofgem’s calculation that the average UK home uses 3,300 kWh of electricity per year).

Let me just restate that; ChatGPT is potentially using the same amount of electricity every day as 170 UK homes use in an entire year.
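The same back-of-the-envelope arithmetic applies here, again using only the figures quoted above:

```python
# The inference-phase comparison: ~564 MWh per day versus the average
# UK home's ~3,300 kWh per year (Ofgem figure).
INFERENCE_MWH_PER_DAY = 564
UK_HOME_KWH_PER_YEAR = 3_300

daily_kwh = INFERENCE_MWH_PER_DAY * 1_000   # 1 MWh = 1,000 kWh
home_year_equivalents = daily_kwh / UK_HOME_KWH_PER_YEAR
print(int(home_year_equivalents))   # 170 homes' annual usage - every single day
```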

Scale that up and, as Alex de Vries has predicted, AI technology could be on track to consume as much energy annually as the entire country of Ireland (29.3 terawatt-hours per year)3.

That. Is. Ridiculous.

Bear in mind that the humble human brain (so derided by AI ‘gurus’) uses a mere 12 kWh of energy per month (on the basis that the human body consumes around 2 kWh per day, roughly 20% of which is dedicated to brain activity)4.
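That brain-energy figure is easily reconstructed from the two assumptions just cited:

```python
# The brain-energy figure above: the human body uses roughly 2 kWh per day,
# about 20% of which goes on brain activity (per the cited paper).
BODY_KWH_PER_DAY = 2
BRAIN_SHARE = 0.20
DAYS_PER_MONTH = 30

brain_kwh_per_month = BODY_KWH_PER_DAY * BRAIN_SHARE * DAYS_PER_MONTH
print(brain_kwh_per_month)   # 12.0 kWh per month
```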

Remember that the above figures only loosely apply to ChatGPT and don’t take into account the slew of other LLMs and AI-powered applications that will inevitably emerge over the next year or more. 

As a recent report from the International Energy Agency (IEA) outlined, the estimated global energy demand of data centres, AI and cryptocurrencies could more than double by 2026, increasing from 460 TWh in 2022 to as much as 1,050 TWh. This demand is roughly equivalent to the total electricity consumption of Japan.

BuT wHaT AbOuT EnErGy EfFiCiEnCy?!

Of course, the natural answer to this is energy efficiency.

“If we just make LLMs more energy efficient, then that’ll solve the problem” I can hear the AI advocates shout. 

And, to an extent, they’re correct. 

It’s become almost a law of nature that any technology will become more energy efficient over time.

And, indeed, Google is busily working away on this problem. As a recently filed patent revealed, Google is attempting to more efficiently allocate computing resources when training an LLM.

Given the stakes (and sheer sums of brain power and money involved) I suspect Google will succeed in reducing the energy consumption of the LLMs underpinning its Search Generative Experience. 

But, even if Google does succeed in making LLMs more energy efficient, it will face an issue first summarised by William Stanley Jevons in 1865. 

Born right here in Liverpool, Jevons was tasked by the government at the time with investigating Britain’s reliance on coal (which at the time was the country’s primary energy source). 

His findings had implications which continue to resonate today. 

In short, he surmised that increased energy efficiency inexorably leads to an overall increase in energy consumption. 

Sounds counterintuitive, right? Well, that’s why it’s a paradox…

But, to make things clearer, let’s take a look at a tangible example. Imagine you have an indoor tomato farm. You have an acre of space which requires heating in order to grow the tomatoes. It costs you £100 per day in electricity to provide that heat. 

Suddenly a clever inventor discovers a way to make heating technology that is 50% more efficient. Brilliant! It now only costs you £50 per day to heat that acre of tomatoes. 

The thing is… what do you do with that extra £50 that’s now burning a hole in your pocket?

Well, thanks mainly to the vagaries of capitalism, it makes sense to heat two acres for the same £100. You’ve just cancelled out that gain in energy efficiency.
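If you prefer your paradoxes in code, the tomato-farm arithmetic looks like this (a sketch using the article’s illustrative figures, not real horticultural costs):

```python
# The Jevons 'rebound effect' in miniature: an efficiency gain is absorbed
# by expanded production, so total energy spend ends up unchanged.
daily_budget = 100        # £ spent on heating one acre, before the breakthrough
efficiency_gain = 0.5     # heating becomes 50% more efficient

new_cost_per_acre = daily_budget * (1 - efficiency_gain)   # now £50 per acre
acres_heated = daily_budget / new_cost_per_acre            # 2 acres for the same money

total_spend = acres_heated * new_cost_per_acre
print(acres_heated, total_spend)   # 2.0 acres, £100 - right back where we started
```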

You can see this in action in the digital space right now. Consider the slew of new apps and devices that ‘feature AI’. Surely, it won’t be long before we’re all using AI-powered petrol pumps and LLM-equipped refrigerators. 

Sure, LLMs may become less energy intensive in themselves, but as per old Jevons, the total number of LLMs (and their concomitant energy use) is undoubtedly set to skyrocket.

Paying for the Search Generative Experience

Okay, so it’s my contention that if Google wants to roll out its Search Generative Experience in any meaningful way, it’s going to incur significant energy costs.

According to estimates by global investment firm Morgan Stanley, ‘Google’s 3.3 trillion search queries last year (2022) cost roughly a fifth of a cent each’. 

What does that number look like if an AI-powered search experience rolls out? Again, back to Morgan Stanley: ‘Google… could face a $6 billion hike in expenses by 2024 if ChatGPT-like AI were to handle half the queries it receives with 50-word answers’ (emphasis mine). 

However, the fun doesn’t end there. 

All of those AI-powered searches won’t just require gobs of electricity - but costly hardware, too. 

As per de Vries, ‘SemiAnalysis estimated that implementing AI similar to ChatGPT in each Google search would require 512,821 of NVIDIA’s A100 HGX servers, totalling 4,102,568 GPUs’.

I’m sure you don’t need me to point out that that’s a bleedin’ enormous number of servers. 

Oh, and the cost of all these servers? A total investment cost of $100 billion5.

When you consider that Google Search generated revenues (that’s revenues, not profits) of $162.5 billion in 2022, it’s clear that the sums don’t exactly add up. 
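Those figures hang together, by the way. A quick consistency check, with every input taken from the numbers quoted above:

```python
# Consistency check on the SemiAnalysis/de Vries hardware figures and
# Google's 2022 search revenue. All inputs are the article's own numbers.
SERVERS = 512_821
GPUS = 4_102_568
HARDWARE_COST_USD = 100e9
SEARCH_REVENUE_2022_USD = 162.5e9

gpus_per_server = GPUS / SERVERS                      # 8 GPUs per HGX A100 server
cost_per_server_usd = HARDWARE_COST_USD / SERVERS     # roughly $195,000 each
hardware_vs_revenue = HARDWARE_COST_USD / SEARCH_REVENUE_2022_USD

print(gpus_per_server, round(cost_per_server_usd), round(hardware_vs_revenue, 2))
# 8.0 195000 0.62 - i.e. the hardware alone would swallow ~62% of a year's revenue
```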

Bear in mind that the aforementioned $100 billion isn’t a ‘one time’ cost either. Servers are subject to the same laws of entropy as everything else, and thus will need replacing over time. 

Factor in the cost of all this hardware, and its exorbitant electricity usage and as John Hennessy (chairman of Google’s parent company Alphabet) admitted to Reuters, AI-powered search could “cost 10 times more” than using traditional search tools (a.k.a. traditional Google). 

On the face of it, Google is dangling its Search Generative Experience in front of an eager audience of digital marketing professionals, but may not actually be able to deliver it in any meaningful way (at least not on an all-query, search-wide basis).

The embodied energy problem

So, what have we learnt so far? That:

  • AI uses enormous amounts of energy, both in its training and inference phases.

  • Thanks to Jevons Paradox (and the frankly cosmic ambitions of AI hucksters), any breakthroughs in AI energy efficiency are likely to be cancelled out by the increased utilisation and deployment of AI.

  • The cost of sufficiently powerful hardware to run an AI-powered search is enormous. So much so, that it may be unfeasible for Google to roll out its Generative Search Experience in any meaningful way. 

But, there’s another issue at stake. One that is widely overlooked, even amongst the more environmentally-conscious marketers out there - the embodied energy issue. 

In addition to consuming ladles of energy during operation, AI servers also require vast amounts of energy in their manufacture. 

As per technology journalist Kris De Decker, ‘the energy used to produce electronic gadgets is considerably higher than the energy used during their operation… A handful of microchips can have as much embodied energy as a car’6.

De Decker makes the incisive point that it’s incredibly difficult to calculate the embodied energy of an electronic device: 

‘A life cycle analysis of high-tech products is extremely complex and can take many years, due to the large amount of parts, materials and processing techniques involved. In the meantime, products and processing technologies keep evolving, with the result that most life cycle analyses are simply outdated when they are published’. 

This is all a way of saying that it’ll be pretty much impossible, at the present time, to calculate the embodied energy of one of NVIDIA’s A100 HGX servers - but, you can guarantee that it’s vast.

To provide a degree of context, consider that ‘the embodied energy of the memory chip alone already exceeds the energy consumption of a laptop during its life expectancy of three years’. 

In other words, the energy used to manufacture said laptop’s memory chip is more than the laptop will use during its entire service life. 

That’s why the embodied energy issue is so significant; electronic devices have a hidden ‘energy debt’ that’s considerably greater than the energy they’ll consume during their use phase. 

AI’s water problem

Aside from needing to wolf down gobs of electricity, AI has a significant appetite for water. 

Those data centres, servers and GPUs need considerable cooling after all. 

2022 saw Google report that it used 5.2 billion gallons of water in its data centres - a jump of 20% from 2021. Microsoft’s data centres have a similar thirst, with the company saying that they used nearly 1.7 billion gallons of water in 2022 - a leap of 34% from 2021.

As researcher Shaolei Ren said to Fortune magazine, ‘it’s fair to say that the majority of the growth is due to AI’.

Further, Ren and his team have estimated that ChatGPT consumes up to 500 millilitres of water for every five to 50 questions (or ‘prompts’) you ask it. Multiply that across the estimated 10 million queries ChatGPT handles each day and that’s a veritable tidal wave of fresh, potable water being used.
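To put a rough range on that (treating the ‘estimated 10 million queries’ as individual prompts, which is an assumption on my part):

```python
# Rough daily water-use range for ChatGPT from the figures above:
# up to 500 ml per batch of 5-50 prompts, ~10 million prompts per day.
ML_PER_BATCH = 500
PROMPTS_PER_DAY = 10_000_000

litres_low = (ML_PER_BATCH / 50) * PROMPTS_PER_DAY / 1_000    # 10 ml per prompt
litres_high = (ML_PER_BATCH / 5) * PROMPTS_PER_DAY / 1_000    # 100 ml per prompt
print(f"{litres_low:,.0f} to {litres_high:,.0f} litres per day")
# 100,000 to 1,000,000 litres per day
```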

All that water usage is already having a real-world impact. Consider that in The Dalles - a city in the state of Oregon - it was revealed that a Google data centre accounted for 29% of the entire city’s water usage!

Likewise, a community in North Holland (which was struck by a drought in 2022) discovered that a Microsoft data centre was using vastly more water than it had originally predicted. Both Microsoft and the local municipality of Hollands Kroon had said that the data centre in Middenmeer would use between 12 million and 20 million litres of water per year.

However, local news outlet Noordhollands Dagblad discovered that the site had in fact used 84 million litres!

AI - and by extension the potential Google Search Generative Experience - will likely only exacerbate the thirst of the world’s data centres…

Conclusion

I appreciate that this has been a somewhat meandering article that is, in places, only tangentially related to AI-powered internet search. 

However, I hope it makes a worthwhile point. 

Yes, I suspect Google will roll out its Search Generative Experience in some form, but it’s unlikely to be as widespread or as impactful as many digital marketers are currently predicting.

What’s more concerning, from my point of view at least, is the titanic amounts of energy that AI-based search would consume. Is it really worth ramping up electricity consumption to even higher levels for the sake of a slightly more flashy, convenient form of search?!

Listen to the rhetoric of the AI evangelists, and you’ll be led to believe that such vast consumption of energy is worth it as AI will immanentise the eschaton. 

Whether the AI hype will match the reality remains to be seen. But that’s not stopping Silicon Valley from pressing ahead with its plans to roll out AI (whether you like it or not).

We know that our (self-appointed) ‘tech overlords’ are aware of AI’s energy issue. Consider Microsoft. Wanting to imbue nearly every one of its products with some form of AI functionality, the company is considering using next-generation nuclear reactors to power its data centres.

The doyen of the modern AI movement, Sam Altman, has also recognised AI’s voracious appetite for energy. Speaking at Davos, Altman admitted that the continued development of AI will require an ‘energy breakthrough’. It’s probably for that reason that he has personally invested $375 million in private U.S. energy company Helion Energy.

So, next time you hear about Google’s Search Generative Experience (or any other AI-powered ‘breakthrough’ for that matter), think about the largely untold story of its energy use and concomitant environmental impact…

References

1. U.S. Energy Information Administration. Unit of Measure Equivalents for Electricity. Available at: https://www.eia.gov/electricity/annual/html/epa_a_05.html (Accessed on 28th January 2024).

2. Ofgem. Electricity generation: facts and figures. Available at: https://www.ofgem.gov.uk/sites/default/files/docs/2006/04/13537-elecgenfactsfs.pdf (Accessed on 28th January 2024).

3. IEEE Spectrum. Generative AI’s Energy Problem Today Is Foundational: Before AI can take over, it will need to find a new approach to energy. Available at: https://spectrum.ieee.org/ai-energy-consumption (Accessed on 28th January 2024).

4. Couillet R, Trystam D, Menissier T. The submerged part of the AI-ceberg. Available at: https://ieeexplore.ieee.org/document/9869565 (Accessed on 28th January 2024).

5. SemiAnalysis. (2023). The Inference Cost of Search Disruption -Large Language Model Cost Analysis. Available at: https://www.semianalysis.com/p/the-inference-cost-of-search-disruption (Accessed on 28th January 2024).

6. LowTechMagazine. (2009). The Monster Footprint of Digital Technology. Available at: https://solar.lowtechmagazine.com/2009/06/the-monster-footprint-of-digital-technology/ (Accessed on 28th January 2024).

Written by:

Matt Donnelly
Senior Content Manager
Matt has been a content writer for over 10 years and is a passionate advocate of the written word. He specialises in creating technical content on a variety of scientific topics, and enjoys communicating complex ideas with clarity. Matt was previously a staff writer for an engineering magazine and keenly follows contemporary environmental issues. Connect with Matt on LinkedIn.