The fields of homomorphic encryption, federated learning, communication protocols, and artificial intelligence are all advancing rapidly, driven by the need for efficient, secure, and private computing. A common theme across these areas is improving performance, fairness, and robustness under non-IID data distributions, alongside enabling secure and private inference.
In homomorphic encryption, researchers are exploring new approaches to bootstrapping, the noise-refreshing operation at the heart of Fully Homomorphic Encryption (FHE) schemes. Notable papers include Bootstrapping as a Morphism, which introduces a new geometric perspective on bootstrapping, and CryptOracle, a modular framework for characterizing FHE. There is also growing interest in applying homomorphic encryption to emerging workloads such as machine learning and spiking neural networks, with papers like FHEON and PrivSpike achieving high accuracy and low latency on various CNN architectures.
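Those schemes are far beyond a few lines of code, but the homomorphic property itself is easy to demonstrate. Below is a toy textbook Paillier cryptosystem, which is additively homomorphic only (no bootstrapping) and uses deliberately insecure toy parameters; it is a minimal sketch of homomorphic evaluation, not any of the cited constructions.

```python
# Toy textbook Paillier: additively homomorphic encryption.
# Illustrative only -- tiny, insecure parameters; the FHE schemes and
# bootstrapping work cited above are far more involved.
import random
from math import gcd

def lcm(a, b):
    return a * b // gcd(a, b)

p, q = 293, 433                      # toy primes; real use needs ~1536-bit primes
n, n2 = p * q, (p * q) ** 2
g = n + 1                            # standard generator choice
lam = lcm(p - 1, q - 1)
mu = pow(lam, -1, n)                 # valid precisely because g = n + 1

def encrypt(m):
    r = random.randrange(1, n)
    while gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    L = (pow(c, lam, n2) - 1) // n   # L(u) = (u - 1) / n
    return (L * mu) % n

x, y = 17, 25
c_sum = (encrypt(x) * encrypt(y)) % n2   # multiplying ciphertexts adds plaintexts
assert decrypt(c_sum) == x + y
print(decrypt(c_sum))                    # -> 42
```

The key line is the ciphertext product: the server can compute on encrypted values without ever decrypting them, which is the property FHE extends from addition to arbitrary circuits.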
In federated learning, researchers are addressing heterogeneous data distributions, quantity skew, and time delays. Notable developments include adaptive boosting mechanisms, clustered federated learning, and personalized prototype learning; papers like FeDABoost, CORNFLQS, and DPMM-CFL propose novel frameworks that improve the accuracy and reliability of models in real-world deployments, building on the federated averaging baseline sketched below.
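For context, that baseline is federated averaging (FedAvg): clients train locally and the server averages their models weighted by local dataset size. Here is a minimal numpy sketch on a hypothetical toy linear-regression task; the client sizes, input scales, and hyperparameters are illustrative assumptions, not settings from the cited papers.

```python
# Minimal FedAvg round on a toy linear model -- a baseline sketch,
# not the FeDABoost / CORNFLQS / DPMM-CFL methods cited above.
import numpy as np

rng = np.random.default_rng(0)

def local_sgd(w, X, y, lr=0.1, epochs=5):
    """A few epochs of full-batch gradient descent on squared loss."""
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

# Heterogeneous (non-IID) clients: different dataset sizes and input scales.
d = 3
w_true = rng.normal(size=d)
clients = []
for size, scale in [(200, 1.0), (50, 3.0), (20, 0.5)]:
    X = rng.normal(scale=scale, size=(size, d))
    clients.append((X, X @ w_true + 0.1 * rng.normal(size=size)))

w_global = np.zeros(d)
for _ in range(20):
    updates = [local_sgd(w_global.copy(), X, y) for X, y in clients]
    sizes = np.array([len(y) for _, y in clients], dtype=float)
    # FedAvg: average client models weighted by local dataset size.
    w_global = np.average(updates, axis=0, weights=sizes)

print(np.linalg.norm(w_global - w_true))  # should be small
```

Quantity skew is visible here directly: the 200-sample client dominates the weighted average, which is exactly the kind of imbalance that clustering- and boosting-based methods aim to correct.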
Work on communication protocols is also moving quickly, with subquadratic two-party communication protocols and public-key encryption schemes built from the MinRank problem. Papers like A Subquadratic Two-Party Communication Protocol for Minimum Cost Flow and Public-Key Encryption from the MinRank Problem push the boundaries of communication complexity and post-quantum cryptography, respectively. Furthermore, the design of adaptive receive scaling factors in over-the-air federated learning (illustrated below) and the analysis of trade-offs in estimating the number of Byzantine clients are advancing privacy-preserving protocols.
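The over-the-air idea can be made concrete: clients pre-scale their updates so their analog signals superpose coherently on the wireless channel, and the server applies a receive scaling factor to turn the noisy superposition into an estimate of the average update. The simulation below assumes real channel gains known to all parties and a common gain target b; it illustrates the general mechanism, not the cited paper's adaptive design.

```python
# Simplified over-the-air FL aggregation: client signals superpose on the
# channel; the server applies a receive scaling factor eta to recover the
# mean update. Illustrative model, not the cited paper's exact scheme.
import numpy as np

rng = np.random.default_rng(1)
K, d = 10, 100                 # clients, model dimension
P, sigma = 1.0, 0.1            # per-client power budget, channel noise std

updates = rng.normal(size=(K, d))
h = rng.uniform(0.5, 1.5, size=K)          # channel gains (assumed known)

# Channel inversion: client k pre-scales by b / h_k so all contributions
# arrive with the same gain b; the power budget caps how large b can be.
b = np.sqrt(P) * h.min()                   # largest b every client can afford
tx = (b / h)[:, None] * updates            # transmitted analog signals
y = (h[:, None] * tx).sum(axis=0) + sigma * rng.normal(size=d)

eta = 1.0 / (b * K)                        # receive scaling factor
est = eta * y                              # estimate of the mean update
true_mean = updates.mean(axis=0)
print(np.linalg.norm(est - true_mean) / np.linalg.norm(true_mean))
```

Note the trade-off the scaling factor mediates: a weaker worst-case channel forces a smaller b, which amplifies the noise term by 1 / (b * K) at the receiver.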
Finally, in artificial intelligence, researchers are optimizing energy efficiency through targeted changes to transformer attention and MLP layers and through fine-grained empirical analysis of inference energy across the core components of the transformer architecture. Noteworthy papers include the Litespark Technical Report; Dissecting Transformers: A CLEAR Perspective towards Green AI; FTTE: Federated Learning on Resource-Constrained Devices; Edge-FIT: Federated Instruction Tuning of Quantized LLMs for Privacy-Preserving Smart Home Environments; and CAFL-L: Constraint-Aware Federated Learning with Lagrangian Dual Optimization for On-Device Language Models.
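The attention-versus-MLP split that such analyses measure can be approximated on the back of an envelope, using FLOPs as a crude proxy for energy. In the sketch below the model dimensions are hypothetical and the formulas are the standard matmul counts (an n-by-k times k-by-m product costs about 2nkm FLOPs), not numbers from the cited reports.

```python
# Back-of-the-envelope FLOPs split between attention and MLP per transformer
# layer -- FLOPs as a rough proxy for inference energy, with hypothetical
# model dimensions rather than figures from the cited reports.
def layer_flops(n_tokens, d_model, d_ff=None):
    d_ff = d_ff or 4 * d_model                  # usual 4x hidden width
    qkvo = 8 * n_tokens * d_model**2            # Q, K, V, O projections: 4 matmuls
    scores = 4 * n_tokens**2 * d_model          # QK^T and attn @ V
    mlp = 4 * n_tokens * d_model * d_ff         # up- and down-projection matmuls
    return qkvo + scores, mlp

for n in (512, 4096, 32768):
    attn, mlp = layer_flops(n, d_model=4096)
    total = attn + mlp
    print(f"n={n:6d}: attention {attn/total:5.1%}, MLP {mlp/total:5.1%}")
```

At short sequence lengths the MLP dominates, while past a few thousand tokens the quadratic attention-score term takes over, which is one reason the two components call for different optimizations.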
Overall, these advances have the potential to significantly improve the accuracy, reliability, and privacy of computing across a wide range of applications, and they highlight the innovative work under way in these fields.