Two new research papers explore optimized subsampling techniques for Differentially Private Stochastic Gradient Descent (DP-SGD). The first, focusing on random shuffling, provides tight upper and lower bounds within the f-DP framework and achieves near-ideal privacy when the number of training rounds is large. The second introduces Balanced Iteration Subsampling (BIS) and demonstrates that structured participation, rather than random sampling, yields stronger privacy amplification and optimal trade-offs across the noise spectrum. Evaluations show BIS consistently outperforms Poisson subsampling in low-noise regimes, reducing the required noise multiplier.
Summary written by gemini-2.5-flash-lite from 3 sources.
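The summary does not specify BIS's construction, so the sketch below is only a minimal, hypothetical Python illustration of the distinction it highlights: Poisson subsampling, where each example joins each batch independently at random, versus a structured, shuffle-style scheme where every example participates exactly once per epoch. The function names and parameters (`poisson_batches`, `shuffled_batches`, `batch_size`) are illustrative assumptions, not the papers' algorithms.

```python
import numpy as np

rng = np.random.default_rng(0)
n, batch_size = 1000, 100          # dataset size and expected batch size (assumed values)
sample_rate = batch_size / n

def poisson_batches(num_steps):
    """Poisson subsampling: each example joins each step's batch independently
    with probability batch_size / n, so participation across steps is uneven."""
    return [np.nonzero(rng.random(n) < sample_rate)[0] for _ in range(num_steps)]

def shuffled_batches(num_steps):
    """Shuffle-style subsampling: each epoch permutes the data and cuts it into
    fixed-size batches, so every example participates exactly once per epoch
    (structured participation)."""
    batches = []
    while len(batches) < num_steps:
        perm = rng.permutation(n)
        batches.extend(np.array_split(perm, n // batch_size))
    return batches[:num_steps]

# Compare how often each example participates over one epoch's worth of steps.
steps = n // batch_size
poisson_counts = np.bincount(np.concatenate(poisson_batches(steps)), minlength=n)
shuffle_counts = np.bincount(np.concatenate(shuffled_batches(steps)), minlength=n)
print("Poisson participation per example:", poisson_counts.min(), "to", poisson_counts.max())
print("Shuffled participation per example:", shuffle_counts.min(), "to", shuffle_counts.max())
```

Running the sketch shows the shuffled scheme gives every example exactly one participation per epoch, while Poisson sampling leaves some examples unused and picks others several times; this controlled participation pattern is the kind of structure the BIS paper credits for stronger amplification.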
IMPACT These studies offer new methods for differentially private machine learning, potentially enabling higher-utility models with stronger privacy guarantees.
RANK_REASON Two academic papers published on arXiv presenting novel theoretical and empirical results for DP-SGD subsampling techniques.