DarkSIDE of AI : Power Hungry process


Intro:

I had the privilege of serving as a panelist, discussing the decision-making process in the design of systems that utilize machine learning/AI to enhance efficiency. During my preparation, I discovered an insightful paper that provided valuable clarity. I highly recommend giving this paper a read.

Double-Edged Sword - Paradox:

The computational demands of large language models are astonishingly high, and some facts about their power consumption are truly staggering. Here are a few that offer some perspective:

1. Training the GPT-3 model, with its 175B parameters - an energy-intensive process that would have resulted in an additional 503 tonnes of carbon emissions, a stark reminder of the environmental considerations we must balance in the pursuit of AI advancements. To put things into perspective, this is roughly equivalent to 500 flights between New York and London.


2. Electricity consumption to train the model - training consumed 1,280 MWh of electricity, enough to power 320 average detached homes in the UK for an entire year.


3. Inference with the LLM - the daily inference operations of GPT-3 consume 564 MWh of electricity, which is sufficient to supply 140 average UK homes with energy for an entire year. This underscores the significant power requirements of keeping such advanced AI systems in active use. (A rough back-of-the-envelope check of these equivalences is sketched after this list.)

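For readers who want to sanity-check those equivalences, here is a rough back-of-the-envelope conversion in Python. The conversion factors (roughly 4 MWh of electricity per average UK home per year and roughly 1 tonne of CO2 per passenger for a New York-London flight) are assumptions inferred from the figures above, not numbers taken from the paper.

```python
# Back-of-the-envelope conversions behind the figures above.
# Conversion factors are assumptions inferred from the article's own numbers.

UK_HOME_MWH_PER_YEAR = 4.0   # assumed average annual consumption of a UK home
FLIGHT_TONNES_CO2 = 1.0      # assumed per-passenger New York-London emissions

def homes_powered_for_a_year(energy_mwh: float) -> float:
    """How many average UK homes the given energy could power for a year."""
    return energy_mwh / UK_HOME_MWH_PER_YEAR

def flight_equivalents(tonnes_co2: float) -> float:
    """How many New York-London passenger flights the emissions correspond to."""
    return tonnes_co2 / FLIGHT_TONNES_CO2

print(homes_powered_for_a_year(1280))  # GPT-3 training: ~320 homes
print(homes_powered_for_a_year(564))   # one day of GPT-3 inference: ~140 homes
print(flight_equivalents(503))         # training emissions: ~500 flights
```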

Motivation for the Research Paper:
The energy consumption of leading cloud computing providers (Meta, Amazon, Microsoft, and Google) has surged in recent years. Inference operations, which are integral to AI models like those used in Google Translate, occur far more often than the training phase, potentially billions of times daily. When we shift our focus from training to inference, we see a different picture, especially for models designed for general purposes. While training a single model for various tasks may seem more energy-efficient initially, this advantage may diminish or even reverse over the model's operational life, because of the sheer amount of inference performed once these models are deployed in consumer services such as chat platforms and web search engines.
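To make that trade-off concrete, here is a small worked example using the GPT-3 figures quoted earlier (about 1,280 MWh for training and about 564 MWh per day of inference). These numbers are illustrative stand-ins, not figures from the paper's own analysis.

```python
# Illustration of why inference can dominate a model's lifetime energy use,
# using the GPT-3 figures quoted earlier in this article (assumed values).

TRAINING_MWH = 1280.0          # one-off training cost
INFERENCE_MWH_PER_DAY = 564.0  # recurring daily inference cost

break_even_days = TRAINING_MWH / INFERENCE_MWH_PER_DAY
print(f"Inference energy overtakes training energy after ~{break_even_days:.1f} days")

# Over a year of deployment, training becomes a small fraction of the total.
yearly_total = TRAINING_MWH + INFERENCE_MWH_PER_DAY * 365
print(f"Training share of one year's total energy: {TRAINING_MWH / yearly_total:.1%}")
```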

Methodology:
The research conducted by the authors encompassed 88 models pertaining to 10 tasks and 30 datasets, covering both natural language processing and computer vision. It examined how factors such as the final task, modality, model size, architecture, and learning approach—whether task-specific or multi-task/multi-purpose—affect energy efficiency. The authors uncovered vast disparities in the energy consumed per inference among different models, modalities, and tasks. The study also highlighted a crucial balance between the advantages of multi-purpose systems and their energy expenditure, along with the related carbon emissions.

Image description

The authors selected the eight most popular models from the HuggingFace Hub, determined by the number of downloads, for all the tasks mentioned. The full list of model identifiers is presented in Table 6 of the Supplementary Materials. For each model, the authors conducted 1,000 inferences for each of the three datasets the model was trained on, as listed in Table 1, utilizing the Transformers library [55]. To ensure the statistical significance of the measurements, each set of inferences was repeated ten times. The inferences were carried out sequentially, without batching, to accurately reflect the variability encountered in real-world model deployment, where batching inputs is often not feasible.
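The paper's exact measurement harness is not reproduced here, but a minimal sketch of that kind of loop could look like the following: sequential, unbatched inference with energy and emissions tracked around the run. The model identifier and the use of the codecarbon library are my assumptions for illustration, not details taken from the paper.

```python
# A minimal sketch of a sequential (unbatched) inference loop with energy
# tracking. The model id and the use of codecarbon are illustrative assumptions.
from codecarbon import EmissionsTracker
from transformers import pipeline

texts = ["..."] * 1000  # stand-in for the 1,000 evaluation samples

# Fine-tuned, task-specific summarization model from the HuggingFace Hub.
summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

tracker = EmissionsTracker(project_name="inference-energy-sketch")
tracker.start()
for text in texts:
    summarizer(text)  # one input at a time - no batching - to mirror deployment
emissions_kg = tracker.stop()  # estimated kg CO2eq for the full run

print(f"Estimated emissions for 1,000 inferences: {emissions_kg:.4f} kg CO2eq")
```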

The authors conducted tests on a subset of tasks to compare the energy consumption and emissions between multi-purpose generative models and individual task-specific systems. The tasks chosen for this comparison were question answering, text classification, and summarization. These particular tasks were selected because the authors could identify a set of models capable of performing them under a unified architectural framework—a feat not possible for all tasks, especially those involving multiple modalities. The eight models were evaluated in a zero-shot setting, which remained consistent across all models. For example, they used the prompt “Summarize the following text: [text]. Summary:” on the same 1,000 samples as the fine-tuned models. Each experiment was repeated ten times to ensure the statistical significance of the results.
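To illustrate that comparison, here is a sketch of the two settings side by side: a fine-tuned task-specific model versus a multi-purpose generative model prompted zero-shot with the template quoted above. The model identifiers are illustrative assumptions, not the ones listed in the paper.

```python
# Sketch of task-specific vs. zero-shot multi-purpose inference for summarization.
# Model ids are illustrative assumptions, not the paper's selection.
from transformers import pipeline

sample = ("The study compared the energy use of task-specific and multi-purpose "
          "models across question answering, text classification, and summarization.")

# Fine-tuned, task-specific summarization model.
task_specific = pipeline("summarization", model="facebook/bart-large-cnn")
print(task_specific(sample, max_length=40)[0]["summary_text"])

# Multi-purpose generative model prompted zero-shot with the paper's template.
generator = pipeline("text-generation", model="gpt2")
prompt = f"Summarize the following text: {sample}. Summary:"
print(generator(prompt, max_new_tokens=40)[0]["generated_text"])
```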


I hope these insights have captured your curiosity. I strongly encourage you to delve into this compelling research and consider the environmental implications when making design decisions.

Reference:

Estimating the Carbon Footprint of BLOOM: Paper Link

Energy consumption in the UK: Energy Guide UK

Utilisation for inference: Google Translate


This content originally appeared on DEV Community and was authored by Bala Madhusoodhanan

