The wide deployment of machine learning in recent years has created great
demand for large-scale, high-dimensional data, whose privacy raises serious
concerns. Differential privacy (DP) mechanisms are conventionally designed for
scalar values rather than structured data such as matrices. Our work
proposes the Improved Matrix Gaussian Mechanism (IMGM) for matrix-valued DP,
based on a necessary and sufficient condition for $(\varepsilon,\delta)
$-differential privacy. IMGM constrains only the singular values of the noise
covariance matrices, which leaves room for design. Among the
legitimate noise distributions for matrix-valued DP, we find that the optimal
one is i.i.d. Gaussian noise, and the DP constraint reduces to a noise lower
bound on each element. We further derive a tight composition method for
IMGM. Beyond the theoretical analysis, experiments on a variety of models and
datasets verify that IMGM yields much higher utility than state-of-the-art
mechanisms under the same privacy guarantee.
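To make the mechanism concrete, here is a minimal sketch of releasing a matrix under $(\varepsilon,\delta)$-DP by adding i.i.d. Gaussian noise to each element. Note the noise calibration below uses the classic Gaussian-mechanism bound $\sigma \ge \sqrt{2\ln(1.25/\delta)}\,s/\varepsilon$ (valid for $\varepsilon < 1$) as a placeholder, not IMGM's tighter element-wise lower bound from the paper; the function name and signature are illustrative assumptions.

```python
import numpy as np

def matrix_gaussian_mechanism(M, l2_sensitivity, eps, delta, rng=None):
    """Release matrix M under (eps, delta)-DP via i.i.d. Gaussian noise.

    Illustrative sketch: calibrates sigma with the classic Gaussian-mechanism
    bound sigma >= sqrt(2 ln(1.25/delta)) * sensitivity / eps (eps < 1),
    not IMGM's tighter bound.
    """
    rng = np.random.default_rng() if rng is None else rng
    # Same standard deviation for every element of the matrix.
    sigma = np.sqrt(2.0 * np.log(1.25 / delta)) * l2_sensitivity / eps
    return M + rng.normal(0.0, sigma, size=M.shape)
```

Because the noise is i.i.d., the matrix shape only affects how many noise draws are made, not the per-element scale, which is what makes the DP constraint an element-wise noise lower bound.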