Differential Privacy in Machine Learning

The field of machine learning is increasingly incorporating differential privacy to safeguard sensitive data. Researchers are exploring its application to clustering problems, support vector machines, and hyperparameter tuning. New algorithms and mechanisms, such as those using correlated noise, are improving the accuracy and efficiency of differentially private training. Differential privacy in federated learning is also gaining attention, with methods being proposed for approximating ROC and PR curves under distributed differential privacy.
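To make the core idea concrete, the sketch below shows the classic Gaussian mechanism, the basic building block behind most differentially private training and statistics release. This is a generic illustration, not the correlated-noise scheme of Cocoon or any specific paper above; the function name and calibration constant are standard but chosen here for exposition.

```python
import numpy as np

def gaussian_mechanism(value, sensitivity, epsilon, delta, rng=None):
    """Release `value` with (epsilon, delta)-differential privacy by adding
    Gaussian noise calibrated to the query's sensitivity (classic bound)."""
    rng = rng or np.random.default_rng()
    # Standard calibration: sigma = sensitivity * sqrt(2 ln(1.25/delta)) / epsilon
    sigma = sensitivity * np.sqrt(2 * np.log(1.25 / delta)) / epsilon
    return value + rng.normal(0.0, sigma)

# Example: privately release the mean of 1000 records bounded in [0, 1].
data = np.random.default_rng(0).uniform(0, 1, size=1000)
true_mean = data.mean()
# The sensitivity of a mean over n values in [0, 1] is 1/n.
private_mean = gaussian_mechanism(true_mean, sensitivity=1 / len(data),
                                  epsilon=1.0, delta=1e-5)
```

With these parameters the noise standard deviation is below 0.005, so the released mean stays close to the true one while each individual record remains protected.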

Two noteworthy papers in this area stand out. Differentially Private Wasserstein Barycenters presents the first algorithms for computing Wasserstein barycenters under differential privacy. DP-HYPE performs a distributed, privacy-preserving hyperparameter search by holding a distributed vote over clients' local hyperparameter evaluations.
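A privacy-preserving distributed vote of the kind DP-HYPE describes can be illustrated with k-ary randomized response, where each client perturbs its preferred hyperparameter index locally before the server tallies votes. This is a hedged sketch of the general technique, not DP-HYPE's actual protocol; the function names and parameters are illustrative.

```python
import numpy as np

def randomized_response_vote(true_choice, num_options, epsilon, rng):
    """Report a hyperparameter index under epsilon-local differential
    privacy using k-ary randomized response."""
    # Probability of reporting the true choice under k-ary randomized response.
    p_truth = np.exp(epsilon) / (np.exp(epsilon) + num_options - 1)
    if rng.random() < p_truth:
        return int(true_choice)
    # Otherwise report a uniformly random *other* option.
    other = int(rng.integers(num_options - 1))
    return other if other < true_choice else other + 1

rng = np.random.default_rng(42)
num_options = 4
# Simulate 2000 clients whose local evaluations mostly favor option 2.
true_votes = rng.choice(num_options, size=2000, p=[0.1, 0.15, 0.6, 0.15])
noisy_votes = [randomized_response_vote(v, num_options, epsilon=2.0, rng=rng)
               for v in true_votes]
counts = np.bincount(noisy_votes, minlength=num_options)
winner = int(counts.argmax())
```

Even though each individual report is noisy, aggregating across many clients lets the server recover the majority-preferred hyperparameter with high probability.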

Sources

The Computational Complexity of Almost Stable Clustering with Penalties

Differentially Private Wasserstein Barycenters

Multi-Class Support Vector Machine with Differential Privacy

Busemann Functions in the Wasserstein Space: Existence, Closed-Forms, and Applications to Slicing

Federated Computation of ROC and PR Curves

DP-HYPE: Distributed Differentially Private Hyperparameter Search

Spectral Graph Clustering under Differential Privacy: Balancing Privacy, Accuracy, and Efficiency

Cocoon: A System Architecture for Differentially Private Training with Correlated Noises
