Advances in Differential Privacy and Quantization

The field of differential privacy and quantization is evolving rapidly, with a focus on methods that protect sensitive data while preserving model accuracy. Recent work on quantization-aware training includes progressive element-wise gradient estimation and flatness-oriented quantization. In differential privacy, notable developments include network-aware differential privacy, differentially private graph clustering initialized via metric embeddings, and personalized w-event privacy protection. Together, these advances stand to improve both the privacy and the accuracy of machine learning models. Noteworthy papers include Progressive Element-wise Gradient Estimation for Neural Network Quantization, which proposes a novel quantization method that improves model accuracy, and PLRV-O, which introduces a framework for optimizing the privacy loss random variable in differentially private deep learning.
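To make the two themes concrete, the sketch below illustrates (a) a generic quantization-aware-training forward pass with a straight-through estimator and (b) per-gradient clipping with Gaussian noise injection in the style of DP-SGD. These are minimal, textbook-style illustrations of the underlying techniques, not implementations of the specific methods in the papers listed here; all function names and parameter choices are assumptions for illustration.

```python
import numpy as np

def fake_quantize(w, num_bits=8):
    """Generic QAT forward pass: uniformly quantize to num_bits symmetric
    integer levels, then dequantize. NOT the progressive element-wise
    method of the cited paper -- a plain uniform-quantization sketch."""
    qmax = 2 ** (num_bits - 1) - 1
    max_abs = np.max(np.abs(w))
    scale = max_abs / qmax if max_abs > 0 else 1.0
    q = np.clip(np.round(w / scale), -qmax - 1, qmax)
    return q * scale

def ste_grad(upstream_grad):
    """Straight-through estimator: round() has zero gradient almost
    everywhere, so backprop treats the quantizer as the identity."""
    return upstream_grad

def dp_noisy_grad(grad, clip_norm=1.0, noise_multiplier=1.1, rng=None):
    """DP-SGD-style gradient sanitization: clip the gradient to an L2
    bound, then add Gaussian noise calibrated to that bound. The
    noise_multiplier value here is an arbitrary illustrative choice."""
    rng = rng if rng is not None else np.random.default_rng(0)
    norm = np.linalg.norm(grad)
    clipped = grad * min(1.0, clip_norm / max(norm, 1e-12))
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=grad.shape)
    return clipped + noise
```

The clip-then-noise step is what a layer-wise noise-injection scheme allocates privacy budget across: each layer's noise scale trades off against the others under a single overall budget.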
Sources
Statistics-Friendly Confidentiality Protection for Establishment Data, with Applications to the QCEW
Rethinking Layer-wise Gaussian Noise Injection: Bridging Implicit Objectives and Privacy Budget Allocation
Beyond Ordinary Lipschitz Constraints: Differentially Private Stochastic Optimization with Tsybakov Noise Condition