The field of machine learning is moving towards incorporating differential privacy to safeguard sensitive data. Researchers are exploring various aspects of differential privacy, including its application to clustering problems, support vector machines, and hyperparameter tuning. New algorithms and mechanisms, such as those based on correlated noise, are improving the accuracy and efficiency of differentially private training. Differential privacy in federated learning settings is also gaining attention, with methods being proposed for approximating ROC and PR curves under distributed differential privacy.
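To make the correlated-noise idea concrete, the following is a minimal, illustrative sketch of a matrix-factorization-style mechanism for releasing noisy prefix sums of per-step gradients. It is not any specific paper's algorithm; the workload size, noise multiplier, and square-root factorization are assumptions chosen only for illustration.

```python
import numpy as np
from scipy.linalg import sqrtm

rng = np.random.default_rng(0)

T = 256                       # number of training steps (illustrative)
A = np.tril(np.ones((T, T)))  # prefix-sum workload: row t sums gradients 1..t
x = rng.normal(size=T)        # per-step clipped gradient scalars, sensitivity 1 each
sigma = 1.0                   # base Gaussian noise multiplier (illustrative)

# Baseline: independent noise added to every step (factorization C = I, B = A).
# The max column norm of I is 1, so the noise scale is sigma; the released
# prefix sums are A @ (x + z), and the error A @ z grows with the step index.
z_indep = rng.normal(scale=sigma, size=T)
err_indep = A @ z_indep

# Correlated noise via a matrix factorization A = B @ C with B = C = sqrtm(A).
# Noise is added to C @ x and mapped back through B, which correlates it
# across steps; to keep the same noise-to-sensitivity ratio (and hence the
# same Gaussian-mechanism guarantee), the scale is sigma times the largest
# column norm of C.
C = np.real(sqrtm(A))
B = C
sens_C = np.linalg.norm(C, axis=0).max()
z_corr = rng.normal(scale=sigma * sens_C, size=T)
err_corr = B @ z_corr

print("independent-noise RMS error:", np.sqrt(np.mean(err_indep ** 2)))
print("correlated-noise  RMS error:", np.sqrt(np.mean(err_corr ** 2)))
```

At the same noise-to-sensitivity ratio, the correlated noise B @ z gives a markedly lower error on the prefix-sum queries than per-step independent noise, which is the intuition behind the accuracy gains mentioned above.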
Noteworthy papers in this area include the paper on Differentially Private Wasserstein Barycenters, which presents the first algorithms for computing Wasserstein barycenters under differential privacy, and the paper on DP-HYPE, which performs a distributed, privacy-preserving hyperparameter search by holding a distributed vote over clients' local hyperparameter evaluations.
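As a rough illustration of the distributed-voting idea attributed to DP-HYPE above, here is a minimal sketch in which each client votes for its locally best hyperparameter and reports the vote through K-ary randomized response. This is a generic local-DP voting sketch, not the paper's actual protocol; the candidate values, client count, epsilon, and simulated local scores are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def randomized_response(true_choice, num_options, epsilon):
    """K-ary randomized response: report the true vote with probability
    e^eps / (e^eps + K - 1), otherwise a uniformly random other option.
    Each client's reported vote is epsilon-locally differentially private."""
    p_true = np.exp(epsilon) / (np.exp(epsilon) + num_options - 1)
    if rng.random() < p_true:
        return true_choice
    others = [k for k in range(num_options) if k != true_choice]
    return int(rng.choice(others))

# Illustrative setup (all values are assumptions, not taken from the paper).
candidates = [1e-3, 3e-3, 1e-2, 3e-2]   # e.g. candidate learning rates
num_clients = 200
epsilon = 1.0

# Each client scores every candidate on its local data and votes for the best.
# Local validation accuracy is simulated here; in practice each client would
# train and evaluate locally.
true_quality = np.array([0.70, 0.78, 0.74, 0.65])
votes = []
for _ in range(num_clients):
    local_scores = true_quality + rng.normal(scale=0.05, size=len(candidates))
    local_best = int(np.argmax(local_scores))
    votes.append(randomized_response(local_best, len(candidates), epsilon))

# The server sees only the privatized votes; it debiases the vote histogram
# and selects the candidate with the highest estimated support.
K = len(candidates)
counts = np.bincount(votes, minlength=K).astype(float)
p = np.exp(epsilon) / (np.exp(epsilon) + K - 1)
q = (1 - p) / (K - 1)
est = (counts - num_clients * q) / (p - q)
winner = int(np.argmax(est))
print("estimated votes per candidate:", np.round(est, 1))
print("selected hyperparameter:", candidates[winner])
```

The design choice illustrated here is that only a single privatized vote leaves each client, so the server never observes raw local evaluation scores.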