Advances in Edge-Cloud Computing

The field of edge-cloud computing is moving toward distributed, hybrid architectures that balance privacy, latency, and cost efficiency. Researchers are exploring techniques such as edge AI, fog computing, and the orchestration of domain-specific language models to optimize performance and reduce costs. Notable innovations include on-device assistants, edge-cloud orchestrators, and lightweight language models, which are enabling more efficient and secure computing. Several papers stand out:

- HomeLLaMA, which provides privacy-preserving, personalized smart home services via a tailored small language model.
- ECO-LLM, a system that optimizes edge-cloud collaboration for large language models, reducing cost and latency while improving accuracy.
- LiLM-RDB-SFC, a lightweight language model with relational-database-guided DRL for optimized SFC provisioning, which demonstrates improved performance and efficiency.
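The core idea behind edge-cloud orchestration of language models can be illustrated with a minimal routing sketch: answer a query with a small on-device model when it is confident enough, and escalate to a larger cloud model otherwise. This is a simplified illustration, not the design of any specific system above; the model stubs, the confidence field, and the threshold value are all assumptions for demonstration.

```python
from dataclasses import dataclass

@dataclass
class Answer:
    text: str
    confidence: float  # model's self-reported confidence in [0, 1]

# Stub models: a real deployment would wrap an on-device small language
# model and a cloud-hosted LLM API here. These are hypothetical placeholders.
def edge_model(query: str) -> Answer:
    # Pretend the small model handles familiar smart-home commands well
    # and is uncertain about everything else.
    if "thermostat" in query:
        return Answer("Setting the thermostat to 21C.", 0.92)
    return Answer("I'm not sure.", 0.30)

def cloud_model(query: str) -> Answer:
    # Larger model: assumed higher quality, but higher cost and latency.
    return Answer(f"Cloud answer for: {query}", 0.99)

def route(query: str, threshold: float = 0.8) -> tuple[str, str]:
    """Answer locally when the edge model is confident enough;
    otherwise escalate to the cloud model (trading cost for quality)."""
    local = edge_model(query)
    if local.confidence >= threshold:
        return "edge", local.text
    return "cloud", cloud_model(query).text
```

A confidence threshold is only one possible routing signal; real orchestrators may also weigh per-query cost budgets, network conditions, or privacy constraints when deciding where a query runs.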

Sources

Towards Privacy-Preserving and Personalized Smart Homes via Tailored Small Language Models

Orchestration for Domain-specific Edge-Cloud Language Models

Analysis of AI Techniques for Orchestrating Edge-Cloud Application Migration

Leveraging RAG-LLMs for Urban Mobility Simulation and Analysis

LiLM-RDB-SFC: Lightweight Language Model with Relational Database-Guided DRL for Optimized SFC Provisioning

The AI Shadow War: SaaS vs. Edge Computing Architectures

MOFCO: Mobility- and Migration-Aware Task Offloading in Three-Layer Fog Computing Environments

CRAFT: Latency and Cost-Aware Genetic-Based Framework for Node Placement in Edge-Fog Environments
