AI needs to be smaller, reduce energy footprint: UNESCO study

Each request sent to ChatGPT consumes on average 0.34 Wh of electricity, which is between 10 and 70 times a Google search. (AFP)
  • OpenAI CEO Sam Altman recently said that each request sent to ChatGPT consumes on average 0.34 Wh of electricity, which is 10-70 times a Google search.
  • UNESCO calculated that AI energy demand is doubling every 100 days as generative AI tools become embedded in everyday life.

Paris, France — The potential of artificial intelligence is immense, but its equally vast energy consumption needs curbing, and asking shorter questions is one way to achieve that, according to a UNESCO study unveiled Tuesday.

A combination of shorter queries and more specific models could cut AI energy consumption by up to 90 percent without sacrificing performance, UNESCO said in a report published to mark the AI for Good global summit in Geneva.

OpenAI CEO Sam Altman recently revealed that each request sent to its popular generative AI app ChatGPT consumes on average 0.34 Wh of electricity, which is between 10 and 70 times a Google search.

With ChatGPT receiving around a billion requests per day, that amounts to 310 GWh annually, equivalent to the annual electricity consumption of three million people in Ethiopia, for example.

Moreover, UNESCO calculated that AI energy demand is doubling every 100 days as generative AI tools become embedded in everyday life.
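To put those figures in perspective, here is a quick back-of-envelope check using only the numbers quoted above (a rough illustration, not part of the UNESCO report): 310 GWh shared among three million people comes to roughly 100 kWh per person per year, and a 100-day doubling time compounds to roughly a twelvefold increase in demand over a year.

```python
# Back-of-envelope arithmetic on the figures quoted in the article
# (illustrative only; not UNESCO's own calculation).

ANNUAL_GWH = 310            # estimated annual ChatGPT electricity use
PEOPLE = 3_000_000          # population used in the Ethiopia comparison
DOUBLING_PERIOD_DAYS = 100  # UNESCO: AI energy demand doubles every 100 days

# Per-person share of that annual consumption (1 GWh = 1e6 kWh).
kwh_per_person = ANNUAL_GWH * 1e6 / PEOPLE
print(f"~{kwh_per_person:.0f} kWh per person per year")  # ~103 kWh

# Compounding implied by a 100-day doubling time, assuming steady growth.
annual_growth = 2 ** (365 / DOUBLING_PERIOD_DAYS)
print(f"~{annual_growth:.1f}x demand growth per year")    # ~12.6x
```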

“The exponential growth in computational power needed to run these models is placing increasing strain on global energy systems, water resources, and critical minerals, raising concerns about environmental sustainability, equitable access, and competition over limited resources,” the UNESCO report warned.

However, the UNESCO team was able to achieve a nearly 90 percent reduction in electricity usage by reducing the length of the query, or prompt, as well as by using a smaller AI model, without a drop in performance.

Many AI models like ChatGPT are general-purpose models designed to respond on a wide variety of topics, meaning that they must sift through an immense volume of information to formulate and evaluate responses.

The use of smaller, specialised AI models offers major reductions in the electricity needed to produce a response.

So did cutting prompts from 300 to 150 words.
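As a rough illustration of how those two levers combine (a toy scaling assumption made here for clarity, not the report's methodology): if the energy per request scales roughly with model size times the number of tokens processed, then halving the prompt and switching to a model several times smaller multiplies into a cut on the order of 90 percent.

```python
# Toy model: energy per request assumed proportional to model size x tokens
# processed. The 5x size ratio below is an illustrative assumption, not a
# figure from the UNESCO report.

def relative_energy(model_size_ratio: float, token_ratio: float) -> float:
    """Energy relative to the baseline, assuming energy ~ size x tokens."""
    return model_size_ratio * token_ratio

baseline = relative_energy(1.0, 1.0)

# Halve the prompt (300 -> 150 words) and use a specialised model ~5x smaller.
reduced = relative_energy(model_size_ratio=1 / 5, token_ratio=150 / 300)

print(f"Estimated reduction: {(1 - reduced / baseline):.0%}")  # 90%
```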

Already aware of the energy issue, tech giants now all offer miniature versions of their large language models with fewer parameters.

For example, Google offers Gemma, Microsoft has Phi-3, and OpenAI has GPT-4o mini. French AI companies have done likewise; Mistral AI, for instance, has introduced its Ministral model.
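In practice, following the report's advice can be as simple as choosing one of those smaller models and keeping the prompt short. Below is a minimal sketch using the OpenAI Python SDK and the GPT-4o mini model named above; the specific prompt and token limit are illustrative assumptions, not recommendations from the report.

```python
# Minimal sketch: pick a smaller model and keep the prompt short.
# Requires the `openai` package and an OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# A terse prompt in place of a 300-word one, sent to a small model.
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user",
               "content": "Summarise the water cycle in three sentences."}],
    max_tokens=120,
)

print(response.choices[0].message.content)
```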