Differential Privacy in Machine Learning

Machine learning research is placing growing emphasis on differential privacy, with a focus on algorithms and models that balance privacy against utility. Recent work explores noisy stochastic gradient descent, formal differential privacy mechanisms, and hybrid architectures that protect sensitive information while preserving model performance. Notable advances include methods for differentially private linear regression and synthetic data generation with statistical guarantees, as well as hybrid models that fuse convolutional and Transformer components for robust inertial navigation.
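The noisy stochastic gradient descent idea mentioned above can be sketched in its most common form: clip each per-example gradient to a fixed norm, average, and add Gaussian noise before the update. This is a generic illustration on a least-squares objective, not the specific algorithm analyzed in the cited papers; the function and parameter names (`dp_sgd_least_squares`, `clip`, `noise_mult`) are illustrative assumptions.

```python
import numpy as np

def dp_sgd_least_squares(X, y, epochs=300, lr=0.1, clip=1.0, noise_mult=1.0, seed=0):
    """Sketch of noisy (DP-style) gradient descent on least squares:
    per-example gradients are clipped to L2 norm <= `clip`, averaged,
    and perturbed with Gaussian noise scaled by `noise_mult * clip`.
    Calibrating `noise_mult` to a privacy budget is omitted here."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        # per-example gradients of the squared loss 0.5 * (x.w - y)^2
        residuals = X @ w - y                # shape (n,)
        grads = residuals[:, None] * X       # shape (n, d)
        # clip each gradient to L2 norm at most `clip`
        norms = np.linalg.norm(grads, axis=1, keepdims=True)
        grads = grads / np.maximum(1.0, norms / clip)
        # average, then add Gaussian noise (the privacy/utility knob)
        noisy_grad = grads.mean(axis=0) + rng.normal(0.0, noise_mult * clip / n, size=d)
        w -= lr * noisy_grad
    return w
```

Larger `noise_mult` strengthens the privacy guarantee but degrades the fitted weights, which is exactly the privacy-utility trade-off the surveyed work studies.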

Noteworthy papers include "Differentially Private Linear Regression and Synthetic Data Generation with Statistical Guarantees," which proposes a method for linear regression with valid statistical inference under Gaussian differential privacy, and "ConvXformer: Differentially Private Hybrid ConvNeXt-Transformer for Inertial Navigation," which introduces an architecture fusing ConvNeXt blocks with Transformer encoders for robust inertial navigation under differential privacy guarantees.
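One standard route to differentially private linear regression is sufficient-statistics perturbation: add calibrated Gaussian noise to the summary statistics X^T X and X^T y, then solve the (regularized) normal equations. The sketch below illustrates that general idea only; it is not claimed to be the mechanism of the cited paper, and `noise_std` stands in for a properly sensitivity-calibrated noise scale.

```python
import numpy as np

def dp_linear_regression(X, y, noise_std=0.01, ridge=1e-3, seed=0):
    """Sketch of sufficient-statistics perturbation for linear regression:
    perturb X^T X (with symmetric noise) and X^T y, then solve the
    ridge-regularized normal equations. A real DP guarantee requires
    bounding the data and calibrating `noise_std` to that sensitivity."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # noisy second-moment matrix; symmetrize the noise so xtx stays symmetric
    N = rng.normal(0.0, noise_std, size=(d, d))
    xtx = X.T @ X + (N + N.T) / 2
    # noisy cross-moment vector
    xty = X.T @ y + rng.normal(0.0, noise_std, size=d)
    # ridge term keeps the noisy matrix well-conditioned and invertible
    return np.linalg.solve(xtx + ridge * np.eye(d), xty)
```

Because only two fixed-size summaries are released, the noise cost is independent of the number of rows, which is why statistics-perturbation approaches scale well to large private datasets.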

Sources

High-Dimensional Privacy-Utility Dynamics of Noisy Stochastic Gradient Descent on Least Squares

Differentially Private Linear Regression and Synthetic Data Generation with Statistical Guarantees

ConvXformer: Differentially Private Hybrid ConvNeXt-Transformer for Inertial Navigation

Enabling Granular Subgroup Level Model Evaluations by Generating Synthetic Medical Time Series
