Advances in Differential Privacy and Quantization

The field of differential privacy and quantization is evolving rapidly, with a focus on methods that protect sensitive data while preserving model accuracy. Recent research has explored new approaches to quantization-aware training, including progressive element-wise gradient estimation and flatness-oriented quantization. There have also been notable advances in differential privacy, including network-aware differential privacy, metric-embedding-initialized differentially private graph clustering, and personalized w-event privacy protection. Together, these developments stand to improve both the privacy and the accuracy of machine learning models. Noteworthy papers include Progressive Element-wise Gradient Estimation for Neural Network Quantization, which proposes a quantization method that improves model accuracy, and PLRV-O, which introduces a framework for optimizing the privacy loss random variable in differentially private deep learning.
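To ground the differential-privacy side of this discussion, the sketch below shows the classical Gaussian mechanism, the standard building block that noise-injection approaches such as layer-wise Gaussian noise allocation refine. This is a minimal illustration using the textbook noise calibration (valid for epsilon ≤ 1), not the method of any paper listed here; the function name and parameters are illustrative.

```python
import math
import random

def gaussian_mechanism(value, sensitivity, epsilon, delta):
    """Release `value` with (epsilon, delta)-differential privacy
    via the classical Gaussian mechanism.

    Uses the textbook calibration sigma = S * sqrt(2 * ln(1.25/delta)) / epsilon,
    which is valid for epsilon <= 1. `sensitivity` (S) bounds how much the
    statistic can change when one individual's record is added or removed.
    """
    sigma = sensitivity * math.sqrt(2.0 * math.log(1.25 / delta)) / epsilon
    return value + random.gauss(0.0, sigma)

# Example: privatize a count query (sensitivity 1) with epsilon = 0.5.
noisy_count = gaussian_mechanism(42, sensitivity=1.0, epsilon=0.5, delta=1e-5)
```

Methods like PLRV-O go beyond this fixed calibration by treating the distribution of the privacy loss random variable itself as an object to optimize, tightening the accuracy obtained for a given privacy budget.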

Sources

Progressive Element-wise Gradient Estimation for Neural Network Quantization

Quantization Meets OOD: Generalizable Quantization-aware Training from a Flatness Perspective

Statistics-Friendly Confidentiality Protection for Establishment Data, with Applications to the QCEW

Augmented Shuffle Differential Privacy Protocols for Large-Domain Categorical and Key-Value Data

Reusing Samples in Variance Reduction

Managing Correlations in Data and Privacy Demand

A Comprehensive Guide to Differential Privacy: From Theory to User Expectations

DPQuant: Efficient and Differentially-Private Model Training via Dynamic Quantization Scheduling

Rethinking Layer-wise Gaussian Noise Injection: Bridging Implicit Objectives and Privacy Budget Allocation

An Interactive Framework for Finding the Optimal Trade-off in Differential Privacy

Beyond Ordinary Lipschitz Constraints: Differentially Private Stochastic Optimization with Tsybakov Noise Condition

Network-Aware Differential Privacy

Metric Embedding Initialization-Based Differentially Private and Explainable Graph Clustering

PLRV-O: Advancing Differentially Private Deep Learning via Privacy Loss Random Variable Optimization

Verifying Sampling Algorithms via Distributional Invariants

Infinite Stream Estimation under Personalized $w$-Event Privacy

Tight Privacy Audit in One Run

PAnDA: Rethinking Metric Differential Privacy Optimization at Scale with Anchor-Based Approximation

Approximate Algorithms for Verifying Differential Privacy with Gaussian Distributions
