Advances in Energy-Efficient AI and Federated Learning

The field of artificial intelligence is moving toward more energy-efficient and privacy-preserving solutions. Recent developments in federated learning and large language models have shown promising results in reducing energy consumption while improving model performance. Researchers are pursuing two complementary directions: targeted optimizations to transformer attention and MLP layers, and empirical profiling of where inference energy is actually spent across the components of the transformer architecture (illustrative sketches of component-level energy profiling and of a basic federated averaging round follow this overview).

Noteworthy papers include the Litespark Technical Report, which introduces a novel pre-training framework that achieves substantial performance gains and reduced energy consumption, and Dissecting Transformers: A CLEAR Perspective towards Green AI, which presents a fine-grained empirical analysis of inference energy across the core components of the transformer architecture. Other notable works include FTTE: Federated Learning on Resource-Constrained Devices; Edge-FIT: Federated Instruction Tuning of Quantized LLMs for Privacy-Preserving Smart Home Environments; and CAFL-L: Constraint-Aware Federated Learning with Lagrangian Dual Optimization for On-Device Language Models, all of which advance energy-efficient, privacy-preserving AI.
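
To make the component-level energy framing concrete, here is a minimal sketch of per-component inference energy profiling for a transformer block. It is an assumption-laden illustration, not the instrumentation used in Dissecting Transformers or any other cited paper: the toy attention/MLP block, the use of PyTorch forward hooks, and the coarse energy estimate (NVML device power sampled around each call, times wall time) are all illustrative choices, and it requires an NVIDIA GPU with pynvml installed.

```python
# Hypothetical sketch: attribute inference energy to transformer components
# (attention vs. MLP) via forward hooks and NVML power sampling.
# Coarse proxy only: whole-device power x component wall time.
import time
import torch
import torch.nn as nn
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

def gpu_power_watts():
    # nvmlDeviceGetPowerUsage reports milliwatts for the whole device.
    return pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0

energy_joules = {}

def attach_probe(module, name):
    state = {}

    def pre_hook(mod, inputs):
        torch.cuda.synchronize()
        state["t0"] = time.perf_counter()
        state["p0"] = gpu_power_watts()

    def post_hook(mod, inputs, output):
        torch.cuda.synchronize()
        elapsed = time.perf_counter() - state["t0"]
        avg_power = (state["p0"] + gpu_power_watts()) / 2.0
        # Energy ~= average device power x component wall time.
        energy_joules[name] = energy_joules.get(name, 0.0) + avg_power * elapsed

    module.register_forward_pre_hook(pre_hook)
    module.register_forward_hook(post_hook)

# Toy transformer block with separately profiled attention and MLP parts.
d_model = 512
block = nn.ModuleDict({
    "attention": nn.MultiheadAttention(d_model, num_heads=8, batch_first=True),
    "mlp": nn.Sequential(nn.Linear(d_model, 4 * d_model), nn.GELU(),
                         nn.Linear(4 * d_model, d_model)),
}).cuda()

attach_probe(block["attention"], "attention")
attach_probe(block["mlp"], "mlp")

x = torch.randn(8, 128, d_model, device="cuda")
with torch.no_grad():
    for _ in range(100):  # repeat so each component's window is measurable
        attn_out, _ = block["attention"](x, x, x)
        _ = block["mlp"](attn_out)

for name, joules in energy_joules.items():
    print(f"{name}: ~{joules:.2f} J over 100 forward passes")
```

In practice, fine-grained studies isolate components far more carefully (fixed clocks, warm-up runs, per-kernel measurement); this sketch only shows the hook-based attribution idea.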
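
Likewise, for the federated learning theme, the sketch below shows one round of plain federated averaging (FedAvg) in numpy: clients train locally on private data and the server aggregates only weights, never raw data. This is the generic textbook procedure, not the specific algorithm of FTTE, Edge-FIT, or CAFL-L; the linear model, client data, and learning rate are illustrative assumptions.

```python
# Minimal FedAvg sketch: local gradient steps per client, then a
# dataset-size-weighted average of client weights on the server.
import numpy as np

rng = np.random.default_rng(0)

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One client's local training: gradient steps on mean squared error."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

# Simulate a few resource-constrained clients with small local datasets.
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(4):
    X = rng.normal(size=(20, 2))
    y = X @ true_w + 0.1 * rng.normal(size=20)
    clients.append((X, y))

global_w = np.zeros(2)
for round_idx in range(10):
    # Clients train locally; the server averages the resulting weights,
    # weighted by local dataset size (standard FedAvg aggregation).
    local_ws = [local_update(global_w, X, y) for X, y in clients]
    sizes = np.array([len(y) for _, y in clients], dtype=float)
    global_w = np.average(local_ws, axis=0, weights=sizes)

print("learned weights:", global_w)  # approaches [2.0, -1.0]
```

Constraint-aware variants such as the one CAFL-L's title describes would additionally fold device budgets (energy, memory, bandwidth) into the local objective; the specifics are in the paper itself.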

Sources

Litespark Technical Report: High-Throughput, Energy-Efficient LLM Training Framework

Dissecting Transformers: A CLEAR Perspective towards Green AI

Energy Efficiency in Cloud-Based Big Data Processing for Earth Observation: Gap Analysis and Future Directions

FTTE: Federated Learning on Resource-Constrained Devices

Edge-FIT: Federated Instruction Tuning of Quantized LLMs for Privacy-Preserving Smart Home Environments

CAFL-L: Constraint-Aware Federated Learning with Lagrangian Dual Optimization for On-Device Language Models

SVDefense: Effective Defense against Gradient Inversion Attacks via Singular Value Decomposition

Intelligent Healthcare Ecosystems: Optimizing the Iron Triangle of Healthcare (Access, Cost, Quality)

Distributed Low-Communication Training with Decoupled Momentum Optimization

A Lightweight Federated Learning Approach for Privacy-Preserving Botnet Detection in IoT

Towards Carbon-Aware Container Orchestration: Predicting Workload Energy Consumption with Federated Learning

FedSRD: Sparsify-Reconstruct-Decompose for Communication-Efficient Federated Large Language Models Fine-Tuning

Federated Learning for Surgical Vision in Appendicitis Classification: Results of the FedSurg EndoVis 2024 Challenge

Beyond Static Knowledge Messengers: Towards Adaptive, Fair, and Scalable Federated Learning for Medical AI