Advancements in Edge Computing and Large Language Models

The field at the intersection of edge computing and large language models is evolving rapidly, with a focus on optimizing task scheduling, offloading, and resource allocation. A growing line of work uses large language models to guide offloading and allocation decisions, pointing toward more intelligent and adaptive edge systems. Noteworthy papers in this area include: Deadline-Aware Joint Task Scheduling and Offloading in Mobile Edge Computing Systems, which presents a low-complexity optimal job scheduling algorithm; Large Language Model-Based Task Offloading and Resource Allocation for Digital Twin Edge Computing Networks, which matches or surpasses traditional multi-agent reinforcement learning frameworks; and iPLAN: Redefining Indoor Wireless Network Planning Through Large Language Models, which demonstrates superior performance on indoor wireless network planning tasks.
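To make the deadline-aware scheduling and offloading theme concrete, the sketch below shows a simple earliest-deadline-first heuristic that decides, per task, whether to execute locally or offload to an edge server based on estimated completion times. This is a minimal illustration of the general idea only, not the cited paper's algorithm or its complexity guarantees; all compute, uplink, and task parameters are illustrative assumptions.

```python
# Minimal sketch of deadline-aware task offloading (illustrative only):
# earliest-deadline-first ordering plus a greedy local-vs-edge placement
# based on estimated completion times. Parameters are assumptions.
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    cycles: float      # CPU cycles required
    data_bits: float   # input size to upload if offloaded
    deadline: float    # seconds from now

def completion_time_local(task, local_cps, local_backlog):
    # queueing delay at the device plus local execution time
    return local_backlog + task.cycles / local_cps

def completion_time_edge(task, uplink_bps, edge_cps, edge_backlog):
    # upload delay, queueing delay at the edge server, then execution
    return task.data_bits / uplink_bps + edge_backlog + task.cycles / edge_cps

def schedule(tasks, local_cps=1e9, edge_cps=8e9, uplink_bps=20e6):
    local_backlog, edge_backlog = 0.0, 0.0
    plan = []
    # Process tasks in earliest-deadline-first order; greedily pick the
    # placement that finishes sooner and flag any missed deadlines.
    for t in sorted(tasks, key=lambda t: t.deadline):
        t_loc = completion_time_local(t, local_cps, local_backlog)
        t_off = completion_time_edge(t, uplink_bps, edge_cps, edge_backlog)
        if t_loc <= t_off:
            local_backlog = t_loc
            finish, where = t_loc, "local"
        else:
            edge_backlog = t_off
            finish, where = t_off, "edge"
        plan.append((t.name, where, finish, finish <= t.deadline))
    return plan

if __name__ == "__main__":
    tasks = [Task("detect", 4e8, 2e6, 0.30),
             Task("track", 1e8, 5e5, 0.10),
             Task("render", 9e8, 8e6, 0.50)]
    for name, where, finish, ok in schedule(tasks):
        print(f"{name}: {where}, finish={finish:.3f}s, meets deadline={ok}")
```

The same decision structure is what LLM-driven approaches replace or augment: instead of a fixed heuristic, a model proposes placements and resource shares from a description of tasks, links, and server state.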

Sources

Deadline-Aware Joint Task Scheduling and Offloading in Mobile Edge Computing Systems

Large Language Model-Based Task Offloading and Resource Allocation for Digital Twin Edge Computing Networks

iPLAN: Redefining Indoor Wireless Network Planning Through Large Language Models

Deep Reinforcement Learning-Based Scheduling for Wi-Fi Multi-Access Point Coordination

MCIF: Multimodal Crosslingual Instruction-Following Benchmark from Scientific Talks

Oranits: Mission Assignment and Task Offloading in Open RAN-based ITS using Metaheuristic and Deep Reinforcement Learning

Handoff Design in User-Centric Cell-Free Massive MIMO Networks Using DRL

Advancing Compositional LLM Reasoning with Structured Task Relations in Interactive Multimodal Communications

Large Language Models for Wireless Communications: From Adaptation to Autonomy

MPCC: A Novel Benchmark for Multimodal Planning with Complex Constraints in Multimodal Large Language Models
