
We study the fundamental problem of constructing optimal randomization in
Differential Privacy (DP). Depending on the clipping strategy or additional
properties of the processing function, the corresponding sensitivity set
determines the randomization necessary to achieve the required privacy
parameters. Towards the optimal utility-privacy tradeoff, finding the minimal
perturbation for a properly-selected sensitivity set stands as a central
problem in DP research. In practice, l_2/l_1-norm clipping with
Gaussian/Laplace noise mechanisms is among the most common setups; however,
these mechanisms suffer from the curse of dimensionality. For more generic
clipping strategies, the optimal noise for a high-dimensional sensitivity set
remains poorly understood.
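The common setup the abstract refers to, l_2-norm clipping followed by Gaussian noise, can be sketched as follows. This is a minimal illustration of the standard mechanism, not the paper's construction; the function name and parameters are hypothetical:

```python
import numpy as np

def gaussian_mechanism(grads, clip_norm=1.0, sigma=1.0, rng=None):
    """l_2-clip each per-sample gradient, sum, and add Gaussian noise.

    After clipping, one sample can change the sum by at most clip_norm in
    l_2, so noise with standard deviation sigma * clip_norm calibrates the
    privacy guarantee to that sensitivity.
    """
    rng = np.random.default_rng() if rng is None else rng
    clipped = []
    for g in grads:
        norm = np.linalg.norm(g)
        # Scale down only when the l_2 norm exceeds the clipping threshold.
        clipped.append(g * min(1.0, clip_norm / max(norm, 1e-12)))
    total = np.sum(clipped, axis=0)
    noise = rng.normal(0.0, sigma * clip_norm, size=total.shape)
    return total + noise
```

Note that the noise is isotropic: its magnitude grows with the ambient dimension regardless of where the clipped gradients actually lie, which is one face of the curse of dimensionality discussed below.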


In this paper, we revisit the geometry of high-dimensional sensitivity sets
and present a series of results characterizing the non-asymptotically optimal
Gaussian noise for Rényi DP (RDP). Our results are both negative and positive:
on the one hand, we show that the curse of dimensionality is tight for a broad
class of sensitivity sets satisfying certain symmetry properties; on the other
hand, if the representation of the sensitivity set is asymmetric on some group
of orthogonal bases, we show that the optimal noise bounds need not depend
explicitly on either the dimension or the rank. We also revisit sampling in
the high-dimensional scenario, which is key to both privacy amplification and
computational efficiency in large-scale data processing. We propose a novel
method, termed twice sampling, which implements both sample-wise and
coordinate-wise sampling to enable Gaussian noise to fit the sensitivity
geometry more closely. With a closed-form RDP analysis, we prove that twice
sampling produces an asymptotic improvement in privacy amplification given an
additional infinity-norm restriction, especially for small sampling rates.
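A schematic reading of twice sampling, as described above, is to subsample along both axes of the data matrix: a sample-wise Poisson step at rate q, then a coordinate-wise step at rate p, before clipping and noising. This is an illustrative sketch under our own reading of the abstract, not the paper's algorithm; all names and parameters are hypothetical:

```python
import numpy as np

def twice_sampling_step(grads, q=0.01, p=0.5, clip_l2=1.0, clip_linf=0.1,
                        sigma=1.0, rng=None):
    """One schematic twice-sampling step.

    grads: (n, d) array of per-sample gradients.
    Sample-wise Poisson subsampling at rate q, then coordinate-wise
    subsampling at rate p; each kept gradient is restricted in l_inf and
    clipped in l_2 before Gaussian noise is added on the kept coordinates.
    """
    rng = np.random.default_rng() if rng is None else rng
    n, d = grads.shape
    sample_mask = rng.random(n) < q        # sample-wise subsampling
    coord_mask = rng.random(d) < p         # coordinate-wise subsampling
    total = np.zeros(d)
    for g in grads[sample_mask]:
        g = g * coord_mask                        # keep sampled coordinates
        g = np.clip(g, -clip_linf, clip_linf)     # l_inf restriction
        norm = np.linalg.norm(g)
        if norm > clip_l2:                        # l_2 clipping
            g = g * (clip_l2 / norm)
        total += g
    noise = rng.normal(0.0, sigma * clip_l2, size=d) * coord_mask
    return total + noise
```

The intuition is that restricting both the coordinate support and the per-coordinate magnitude shrinks the effective sensitivity set, so the Gaussian noise can hug its geometry more tightly than an isotropic l_2 ball would allow.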

ID: 382429; Latest Change: Sept. 7, 2023, 7:32 a.m.