Advances in Decentralized Learning and Optimization

The field of decentralized learning and optimization is moving toward more efficient and privacy-preserving methods. Researchers are addressing core challenges of distributed optimization, such as stepsize heterogeneity and communication bottlenecks. Decentralized federated learning is gaining particular attention, with a focus on frameworks that can handle dynamic topology changes and resource heterogeneity; energy efficiency and model performance are also key design considerations.

Noteworthy papers in this area include FedMeNF, which proposes a privacy-preserving federated meta-learning approach for neural fields; AerialDB, which presents a lightweight decentralized data storage and query system for drone fleets; and Hat-DFed, which introduces a heterogeneity-aware, energy-efficient decentralized federated learning framework. Together these works demonstrate notable progress, with potential applications in edge computing, the Internet of Vehicles, and drone fleets.
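To make the decentralized setting concrete: unlike server-based federated learning, decentralized federated learning has no central aggregator; each node mixes its model parameters directly with its neighbors' over the communication topology. The sketch below shows one synchronous gossip-averaging round, the basic building block such frameworks are built on. All names (gossip_round, the ring topology, the uniform weights) are illustrative assumptions, not taken from any of the cited papers.

```python
def gossip_round(params, neighbors, weights):
    """One synchronous gossip-averaging round.

    params:    list of per-node parameter vectors (lists of floats)
    neighbors: adjacency list; neighbors[i] = nodes that i hears from
    weights:   weights[i][j] = mixing weight node i gives node j
               (each row sums to 1; a doubly stochastic matrix drives
               all nodes toward the network-wide average)
    """
    new_params = []
    for i in range(len(params)):
        # Start from the node's own parameters, scaled by its self-weight.
        mixed = [weights[i][i] * p for p in params[i]]
        # Add in each neighbor's parameters, scaled by the mixing weight.
        for j in neighbors[i]:
            for k, p in enumerate(params[j]):
                mixed[k] += weights[i][j] * p
        new_params.append(mixed)
    return new_params

# Fully connected 3-node example with uniform weights 1/3:
# every node reaches the mean (3.0) in a single round.
params = [[0.0], [3.0], [6.0]]
neighbors = [[1, 2], [0, 2], [0, 1]]
w = 1.0 / 3.0
weights = [[w, w, w], [w, w, w], [w, w, w]]
params = gossip_round(params, neighbors, weights)
print(params)  # each entry is approximately [3.0]
```

On a sparser topology (e.g. a ring), repeated rounds converge to the same consensus more slowly, which is exactly the communication/convergence trade-off that topology-optimization work such as Hat-DFed targets.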

Sources

Distributed Optimization and Learning for Automated Stepsize Selection with Finite Time Coordination

FedMeNF: Privacy-Preserving Federated Meta-Learning for Neural Fields

Topology Generation of UAV Covert Communication Networks: A Graph Diffusion Approach with Incentive Mechanism

Energy Efficient Task Offloading in UAV-Enabled MEC Using a Fully Decentralized Deep Reinforcement Learning Approach

AerialDB: A Federated Peer-to-Peer Spatio-temporal Edge Datastore for Drone Fleets

Towards Heterogeneity-Aware and Energy-Efficient Topology Optimization for Decentralized Federated Learning in Edge Environment

Distributed optimization: designed for federated learning

Decentralized Rank Scheduling for Energy-Constrained Multi-Task Federated Fine-Tuning in Edge-Assisted IoV Networks
