PulseAugur

New DP-SGD subsampling methods offer improved privacy-utility trade-offs

Two new research papers explore optimized subsampling techniques for Differentially Private Stochastic Gradient Descent (DP-SGD). The first paper, focusing on random shuffling, provides tight upper and lower bounds within the f-DP framework, achieving near-ideal privacy at high numbers of training rounds. The second introduces Balanced Iteration Subsampling (BIS), showing that structured participation, rather than fully random sampling, yields stronger privacy amplification and optimal trade-offs across the noise spectrum. Evaluations show BIS consistently outperforms Poisson subsampling in low-noise regimes, reducing the required noise multiplier.
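To make the contrast concrete, here is a minimal sketch of the two classical subsampling schemes the papers analyze (Poisson subsampling and shuffle-based batching) together with the clip-and-noise step of a DP-SGD update. This is an illustrative sketch only: the function names and the simple Gaussian-noise step are our own, and it does not implement the BIS scheme from the second paper.

```python
import numpy as np

def poisson_subsample(n, q, rng):
    """Poisson subsampling: each of the n examples joins the batch
    independently with probability q, so batch size is random."""
    mask = rng.random(n) < q
    return np.flatnonzero(mask)

def shuffle_subsample(n, batch_size, rng):
    """Shuffle-based subsampling: one random permutation per epoch,
    consumed in fixed-size batches, so every example participates
    exactly once per epoch."""
    perm = rng.permutation(n)
    return [perm[i:i + batch_size] for i in range(0, n, batch_size)]

def noisy_clipped_mean(grads, clip_norm, noise_multiplier, rng):
    """One DP-SGD step on a batch of per-example gradients:
    clip each gradient to clip_norm, average, and add Gaussian noise
    with std = noise_multiplier * clip_norm / batch_size."""
    norms = np.linalg.norm(grads, axis=1, keepdims=True)
    clipped = grads * np.minimum(1.0, clip_norm / np.maximum(norms, 1e-12))
    mean = clipped.mean(axis=0)
    noise = rng.normal(0.0, noise_multiplier * clip_norm / len(grads),
                       size=mean.shape)
    return mean + noise
```

The privacy-amplification analyses the papers sharpen differ precisely in how the batch indices are drawn: Poisson sampling makes each example's participation independent (tractable but high-variance), while shuffling or balanced schemes constrain participation per epoch.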

Summary written by gemini-2.5-flash-lite from 3 sources.

IMPACT These studies offer new methods for differentially private machine learning, potentially enabling higher utility models with stronger privacy guarantees.

RANK_REASON Two academic papers published on arXiv presenting novel theoretical and empirical results for DP-SGD subsampling techniques.


COVERAGE [3]

  1. arXiv cs.LG TIER_1 · Marten van Dijk, Murat Bilgehan Ertan

    Trade-off Functions for DP-SGD with Subsampling based on Random Shuffling: Tight Upper and Lower Bounds

    arXiv:2605.06259v1 Announce Type: new Abstract: We derive a tight analysis of the trade-off function for Differentially Private Stochastic Gradient Descent (DP-SGD) with subsampling based on random shuffling within the $f$-DP framework. Our analysis covers the regime $\sigma \geq…

  2. arXiv stat.ML TIER_1 · Andy Dong, Ayfer Özgür

    Less Random, More Private: What is the Optimal Subsampling Scheme for DP-SGD?

    arXiv:2605.07072v1 Announce Type: cross Abstract: Poisson subsampling is the default sampling scheme in differentially private machine learning, largely because its unstructured randomness yields tractable privacy amplification analyses. Yet this same randomness introduces substa…

  3. arXiv stat.ML TIER_1 · Ayfer Özgür ·

    Less Random, More Private: What is the Optimal Subsampling Scheme for DP-SGD?

    Poisson subsampling is the default sampling scheme in differentially private machine learning, largely because its unstructured randomness yields tractable privacy amplification analyses. Yet this same randomness introduces substantial participation variance: each sample appears …