Sustainable AI: Quantifying Climate Risk and Energy Efficiency

The field of artificial intelligence (AI) is expanding rapidly, and with it a focus on sustainability and energy efficiency. Recent work highlights the need to quantify the climate risk of AI systems, particularly those built on large language models (LLMs). Researchers are developing methods to estimate the carbon footprint of LLMs, including frameworks such as G-TRACE and CO2-Meter, which measure the energy consumption and carbon emissions of LLM training and inference and thereby inform sustainable deployment decisions.

Studies have also examined LLM performance on edge devices, demonstrating the potential for localized, privacy-preserving inference: small LLMs paired with local accelerators can achieve competitive performance while reducing energy consumption.

Noteworthy papers in this area include the introduction of the AI Sustainability Pyramid, a governance model for sustainable AI deployment; the BRACE framework for benchmarking LLMs on energy efficiency and functional correctness; and the proposal of intelligence per watt (IPW), a metric for assessing local inference capability and efficiency, with significant implications for the future of AI development.
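The two quantities discussed above can be made concrete with a minimal sketch. Note that all function names, formulas, and numbers below are illustrative assumptions, not the actual definitions used by G-TRACE, CO2-Meter, or the IPW paper: operational carbon is modeled simply as energy times the regional grid's carbon intensity, and IPW as a benchmark-score-per-watt ratio.

```python
# Illustrative sketch only; these are hypothetical simplifications, not the
# published G-TRACE, CO2-Meter, or IPW methodologies.

def carbon_footprint_kg(energy_kwh: float, grid_intensity_kg_per_kwh: float) -> float:
    """Operational carbon: energy consumed times the regional grid's
    carbon intensity (kg CO2e per kWh). Region-aware accounting would
    vary the intensity by location and time."""
    return energy_kwh * grid_intensity_kg_per_kwh

def intelligence_per_watt(benchmark_score: float, avg_power_watts: float) -> float:
    """One plausible reading of an IPW-style ratio: capability (some
    benchmark score) delivered per unit of average power draw."""
    return benchmark_score / avg_power_watts

# Example: a 10-hour inference workload drawing 0.4 kW in a region whose
# grid emits 0.35 kg CO2e per kWh.
energy = 0.4 * 10                            # 4.0 kWh
print(carbon_footprint_kg(energy, 0.35))     # 1.4 kg CO2e
print(intelligence_per_watt(72.5, 55.0))     # ~1.32 score points per watt
```

Under this simplification, the same workload's footprint scales linearly with grid intensity, which is why region-aware accounting matters: running in a low-carbon region directly shrinks the estimate.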

Sources

Quantifying the Climate Risk of Generative AI: Region-Aware Carbon Accounting with G-TRACE and the AI Sustainability Pyramid

An Evaluation of LLMs Inference on Popular Single-board Computers

Agentic Educational Content Generation for African Languages on Edge Devices

Analysing Environmental Efficiency in AI for X-Ray Diagnosis

Dynamic Stability of LLM-Generated Code

Smart but Costly? Benchmarking LLMs on Functional Accuracy and Energy Efficiency

Intelligence per Watt: Measuring Intelligence Efficiency of Local AI

A robust methodology for long-term sustainability evaluation of Machine Learning models

CO2-Meter: A Comprehensive Carbon Footprint Estimator for LLMs on Edge Devices

Energy Consumption of Dataframe Libraries for End-to-End Deep Learning Pipelines: A Comparative Analysis
