- Differential privacy for machine learning models can be obtained in four ways: input perturbation, output perturbation, objective perturbation, and modification of the optimization algorithm. The fourth method alters the optimization algorithm used to train the model; it includes the noisy SGD methods discussed in the next section.
- The analysis of such algorithms can be broken into two parts:
  - Obtain $(\varepsilon', \delta')$-differential privacy for each round of SGD, by ensuring that any information from the dataset used to update the model parameters is differentially private.
  - Compute the total privacy cost of all SGD iterations to obtain the overall $(\varepsilon, \delta)$ parameters.
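The two steps above can be sketched numerically. This is an illustrative aside, not from the paper: the function names are mine, and the basic/advanced composition formulas are the standard ones from the DP literature.

```python
import math

def basic_composition(eps_round, delta_round, T):
    """Total privacy cost of T (eps', delta')-DP rounds under basic composition."""
    return T * eps_round, T * delta_round

def advanced_composition(eps_round, delta_round, T, delta_slack):
    """Tighter total (eps, delta) via the advanced composition theorem,
    at the cost of an extra delta_slack in the delta term."""
    eps_total = (math.sqrt(2 * T * math.log(1 / delta_slack)) * eps_round
                 + T * eps_round * (math.exp(eps_round) - 1))
    return eps_total, T * delta_round + delta_slack

# Basic composition grows linearly in T; advanced composition grows
# roughly as sqrt(T) for small per-round epsilon.
eps_basic, _ = basic_composition(0.01, 1e-7, 10000)
eps_adv, _ = advanced_composition(0.01, 1e-7, 10000, delta_slack=1e-6)
```

For many rounds with a small per-round budget, advanced composition gives a substantially smaller total epsilon, which is why the accounting method matters.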
- Several methods keep track of the accumulated privacy loss over multiple iterations of SGD:
  - Privacy accountant [32]
  - Moments accountant (for the Gaussian mechanism) [14]
  - Adaptive strategies to select the privacy parameters $\varepsilon'$ and $\delta'$ [37,38]
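A hedged sketch of how a moments-accountant-style computation composes the Gaussian mechanism, using the standard Rényi-DP formulas (no subsampling; the function names are mine, not from the cited papers):

```python
import math

def gaussian_rdp(alpha, sigma):
    """Renyi DP of order alpha for a Gaussian mechanism with noise multiplier sigma."""
    return alpha / (2 * sigma ** 2)

def rdp_to_dp(rdp_eps, alpha, delta):
    """Standard conversion from (alpha, rdp_eps)-RDP to (eps, delta)-DP."""
    return rdp_eps + math.log(1 / delta) / (alpha - 1)

def total_eps(sigma, T, delta, alphas=range(2, 64)):
    """Compose T Gaussian rounds additively in RDP, then take the best
    (eps, delta) conversion over a grid of orders alpha."""
    return min(rdp_to_dp(T * gaussian_rdp(a, sigma), a, delta) for a in alphas)

eps = total_eps(sigma=4.0, T=1000, delta=1e-5)
```

Tracking the loss in RDP (equivalently, in moments) and converting to $(\varepsilon, \delta)$ only at the end is what gives the moments accountant its tighter bounds compared to composing $(\varepsilon', \delta')$ guarantees directly.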
- Several methods bound the sensitivity of the gradients at each round of SGD:
  - Bound the gradient norm by the Lipschitz bound [16]
  - Clip each coordinate of the stochastic gradient vector to the range $[-C, C]$ [13]
  - Bound the $l_2$-norm of the stochastic gradient by clipping the gradient $l_2$-norm to a threshold $C$ [14]
  - Adaptive strategies to select the $l_2$-norm threshold $C$ [40,41]
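A minimal sketch of the third option, the per-example $l_2$-norm clipping of [14] (DP-SGD style): clip each per-example gradient to norm at most $C$, average, and add Gaussian noise calibrated to $C$. The function name and shapes are illustrative assumptions.

```python
import numpy as np

def dp_sgd_gradient(per_example_grads, C, sigma, rng=None):
    """per_example_grads: (batch, dim) array of per-example gradients.
    Returns the clipped, averaged, noised gradient."""
    if rng is None:
        rng = np.random.default_rng(0)
    batch, dim = per_example_grads.shape
    norms = np.linalg.norm(per_example_grads, axis=1, keepdims=True)
    # Scale down any gradient whose l2 norm exceeds the threshold C;
    # gradients already within the threshold are left unchanged.
    clipped = per_example_grads * np.minimum(1.0, C / np.maximum(norms, 1e-12))
    # Gaussian noise with std sigma * C matches the sensitivity C of the sum.
    noise = rng.normal(0.0, sigma * C, size=dim)
    return clipped.mean(axis=0) + noise / batch

g = dp_sgd_gradient(np.array([[3.0, 4.0], [0.3, 0.4]]), C=1.0, sigma=1.0)
```

Clipping bounds the sensitivity of the averaged gradient to $C/\text{batch}$, which is what makes the Gaussian noise scale well-defined.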
- AdaCliP
- Main contributions of this paper
  - Previous algorithms: added noise to the gradients themselves.
  - This algorithm: transform the gradient by a function, add noise, and apply the inverse of the function. Note that the choices $a^t = (0, 0, \ldots, 0)$ and $b^t = (C, C, \ldots, C)$ recover the algorithm of [14].
  - Resulting advantage: reduces the variance and bias of the differentially private gradients, which by Lemma 2 yields a better solution.
- What are the optimal choices of $a^t$ and $b^t$?
- Convergence analysis
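The transform-noise-invert idea described above can be sketched as follows. This is my illustration of the mechanism from these notes, not the paper's exact AdaCliP algorithm (which additionally maintains running estimates for $a^t$ and $b^t$); the function name and parameterization are assumptions.

```python
import numpy as np

def transform_noise_invert(grad, a_t, b_t, C, sigma, rng=None):
    """Shift by a_t and scale by b_t, clip the transformed gradient to l2
    norm C, add Gaussian noise, then apply the inverse transform."""
    if rng is None:
        rng = np.random.default_rng(0)
    z = (grad - a_t) / b_t                             # transform
    norm = np.linalg.norm(z)
    z = z * min(1.0, C / max(norm, 1e-12))             # clip in transformed space
    z = z + rng.normal(0.0, sigma * C, size=z.shape)   # add noise
    return b_t * z + a_t                               # invert the transform

# With a_t = 0 and b_t = (C', ..., C'), this reduces to plain l2 clipping
# with threshold C * C' and noise std sigma * C * C', matching the observation
# in the notes that those choices recover the algorithm of [14].
g = transform_noise_invert(np.array([1.0, 2.0]), a_t=np.zeros(2),
                           b_t=np.ones(2), C=1.0, sigma=0.0)
```

Good choices of $a^t$ and $b^t$ shrink the transformed gradients toward a well-conditioned ball, so less of each gradient is lost to clipping and less noise is injected per unit of signal.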
Ref
Pichapati, V., Suresh, A. T., Yu, F. X., Reddi, S. J., & Kumar, S. (2019). AdaCliP: Adaptive clipping for private SGD. arXiv preprint arXiv:1908.07643.