Differential Privacy Advancements

The field of differential privacy is moving toward more robust and efficient methods for protecting user data. Recent developments have focused on improving the accuracy and scalability of differential privacy mechanisms, particularly for local differential privacy (LDP) and multi-user environments. Notably, new auditing frameworks and algorithms have been proposed to quantify and mitigate correlation-induced privacy leakage in LDP mechanisms. There is also growing interest in more reliable and generalizable differentially private machine learning, with an emphasis on reproducibility and replicability. Several papers explore applications of differential privacy in specific domains, such as smart metering and wireless body area networks. Overall, the field is advancing toward more practical and effective solutions for privacy-preserving data analysis and machine learning. Noteworthy papers include KV-Auditor, which proposes a framework for auditing LDP-based key-value estimation mechanisms, and Stabilization of Perturbed Loss Function, which introduces a differentially private training mechanism for multi-user LDP that avoids adding gradient noise.
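For readers new to the LDP setting these papers study, the core idea is that each user perturbs their own data before it ever leaves their device, and the aggregator debiases the noisy reports. The following sketch shows the textbook randomized-response mechanism for a single bit; it is a generic illustration of LDP, not the mechanism of any paper listed below.

```python
import math
import random

def randomized_response(bit: int, epsilon: float) -> int:
    """Report the true bit with probability e^eps / (e^eps + 1),
    otherwise flip it. Satisfies epsilon-LDP for one binary value."""
    p_true = math.exp(epsilon) / (math.exp(epsilon) + 1)
    return bit if random.random() < p_true else 1 - bit

def estimate_fraction(reports: list[int], epsilon: float) -> float:
    """Unbiased aggregator-side estimate of the fraction of users
    whose true bit is 1, given the perturbed reports."""
    p = math.exp(epsilon) / (math.exp(epsilon) + 1)
    observed = sum(reports) / len(reports)
    # Invert E[observed] = f*p + (1 - f)*(1 - p) for the true fraction f.
    return (observed - (1 - p)) / (2 * p - 1)
```

The correlation-induced leakage that KV-Auditor and related work examine arises precisely because this per-value privacy guarantee weakens when a user's reported values are statistically dependent on one another.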

Sources

KV-Auditor: Auditing Local Differential Privacy for Correlated Key-Value Estimation

The Hidden Cost of Correlation: Rethinking Privacy Leakage in Local Differential Privacy

Differentially Private aggregate hints in mev-share

Understanding Data Influence with Differential Approximation

A Collusion-Resistance Privacy-Preserving Smart Metering Protocol for Operational Utility

A Lightweight Privacy-Preserving Smart Metering Billing Protocol with Dynamic Tariff Policy Adjustment

Tighter Privacy Analysis for Truncated Poisson Sampling

Towards Reliable and Generalizable Differentially Private Machine Learning (Extended Version)

Locally Differentially Private Multi-Sensor Fusion Estimation With System Intrinsic Randomness

Private Hyperparameter Tuning with Ex-Post Guarantee

Stabilization of Perturbed Loss Function: Differential Privacy without Gradient Noise
