Differential Privacy and Information-Theoretic Security

The field of differential privacy and information-theoretic security is moving toward more sophisticated methods for quantifying and mitigating privacy risks in data analysis and communication systems. Researchers are exploring new connections between information theory and differential privacy, such as characterizing differential privacy guarantees in terms of channel properties, and there is growing interest in privacy-preserving algorithms and mechanisms for specific applications, including machine learning and distributed estimation.

Noteworthy papers in this area include:

Quantifying Information Disclosure During Gradient Descent Using Gradient Uniqueness, which presents a principled disclosure metric for privacy auditing in machine learning.

N-output Mechanism: Estimating Statistical Information from Numerical Data under Local Differential Privacy, which proposes a generalized framework for constructing optimal mechanisms for arbitrary output sizes.

Local Information-Theoretic Security via Euclidean Geometry, which introduces a methodology for investigating local properties of secure communication over discrete memoryless wiretap channels.
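For context on the local model referenced above, a privacy mechanism can be viewed as a noisy channel Q(y|x) from a user's true value x to a released value y; ε-local differential privacy requires Q(y|x) / Q(y|x') ≤ e^ε for all inputs x, x' and outputs y. The sketch below shows the standard Laplace baseline for releasing a bounded numeric value under this constraint. It is a minimal illustration, assuming inputs clipped to a known range [lower, upper]; the function names are ours, and this is not the N-output mechanism of the cited paper, which (per its title) constructs mechanisms with a configurable number of outputs.

```python
import numpy as np

def laplace_ldp(value: float, lower: float, upper: float, epsilon: float) -> float:
    """Release a bounded numeric value under epsilon-local differential privacy
    by adding Laplace noise scaled to the full range, which bounds how much
    any two users' inputs can differ (the sensitivity in the local model)."""
    clipped = float(np.clip(value, lower, upper))  # enforce the assumed bounds
    sensitivity = upper - lower                    # worst-case input difference
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return clipped + noise

# Each client privatizes its own reading; the aggregator's average of the
# noisy reports is an unbiased (if higher-variance) estimate of the true mean.
readings = [0.2, 0.7, 0.4, 0.9]
reports = [laplace_ldp(r, lower=0.0, upper=1.0, epsilon=1.0) for r in readings]
print(sum(reports) / len(reports))
```

Each report's variance scales as (upper - lower)^2 / ε^2, which is the utility cost that optimized constructions such as the N-output mechanism aim to reduce.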
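The gradient-uniqueness idea can likewise be made concrete with a toy heuristic: if one example's per-example gradient has no close neighbor among the other gradients in a batch, releasing it is more identifying. The sketch below is a hypothetical illustration of that intuition only; the nearest-neighbor score and all names are our assumptions, not the metric defined in the cited paper.

```python
import numpy as np

def uniqueness_scores(per_example_grads: np.ndarray) -> np.ndarray:
    """Toy disclosure heuristic (hypothetical, not the cited paper's metric):
    score each example's flattened gradient by its Euclidean distance to the
    nearest other gradient in the batch."""
    diffs = per_example_grads[:, None, :] - per_example_grads[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)  # pairwise distance matrix
    np.fill_diagonal(dists, np.inf)         # ignore each row's self-distance
    return dists.min(axis=1)                # nearest-neighbor distance per example

# The outlier gradient in the last row receives the highest score, flagging
# it as the most disclosive if released without noise.
grads = np.array([[0.10, 0.00], [0.12, 0.01], [0.09, -0.02], [2.00, 1.50]])
print(uniqueness_scores(grads))
```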

Sources

An information theorist's tour of differential privacy

Quantifying Information Disclosure During Gradient Descent Using Gradient Uniqueness

N-output Mechanism: Estimating Statistical Information from Numerical Data under Local Differential Privacy

Phase Transitions of the Additive Uniform Noise Channel with Peak Amplitude and Cost Constraint

Privacy-Preserving Distributed Estimation with Limited Data Rate

Local Information-Theoretic Security via Euclidean Geometry

Task-Based Quantization for Channel Estimation in RIS Empowered MmWave Systems

The Whole Is Less than the Sum of Parts: Subsystem Inconsistency in Partial Information Decomposition
