Applying differential privacy at scale requires convenient ways to check that
programs
computing with sensitive data appropriately preserve privacy. We propose here a
fully automated framework for {\em testing} differential privacy, adapting a well-known
``pointwise'' technique from informal proofs of differential privacy. Our
framework, called DPCheck,
requires no programmer annotations, handles all previously verified or
tested
algorithms, and is the first fully automated framework to distinguish correct from buggy implementations of PrivTree, a
probabilistically terminating algorithm that has not previously been
mechanically checked.
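To convey the flavor of the ``pointwise'' argument that DPCheck adapts, the sketch below (our illustration, {\em not} DPCheck's implementation; the names \texttt{laplace\_sample} and \texttt{pointwise\_check} and all parameters are hypothetical) tests the standard Laplace mechanism on two neighboring query answers: it samples concrete outputs and checks that each output's privacy loss, the log-ratio of its density under the two inputs, never exceeds $\epsilon$.
\begin{verbatim}
import math
import random

def laplace_sample(mean, scale):
    # Inverse-CDF sampling: for U uniform on (-1/2, 1/2),
    # X = mean - scale * sign(U) * ln(1 - 2|U|) is Laplace(mean, scale).
    u = random.random() - 0.5
    while u == -0.5:  # avoid log(0) on the measure-zero endpoint
        u = random.random() - 0.5
    sign = 1.0 if u >= 0.0 else -1.0
    return mean - scale * sign * math.log(1.0 - 2.0 * abs(u))

def laplace_log_density(x, mean, scale):
    # Log-density of Laplace(mean, scale) at x.
    return -abs(x - mean) / scale - math.log(2.0 * scale)

def pointwise_check(answer1, answer2, eps, scale, trials=10000):
    # Sample outputs of the mechanism run on the first input.  For each
    # sampled output, the pointwise privacy loss is the log-ratio of its
    # density under the two neighboring inputs; eps-DP requires this loss
    # to be at most eps at EVERY output, so one violation refutes privacy.
    for _ in range(trials):
        out = laplace_sample(answer1, scale)
        loss = (laplace_log_density(out, answer1, scale)
                - laplace_log_density(out, answer2, scale))
        if loss > eps + 1e-9:
            return False  # found a concrete output witnessing a violation
    return True

eps, sensitivity = 0.1, 1.0
print(pointwise_check(3.0, 4.0, eps, scale=sensitivity / eps))        # True
print(pointwise_check(3.0, 4.0, eps, scale=sensitivity / (2 * eps)))  # False
\end{verbatim}
A noise scale calibrated to sensitivity$/\epsilon$ always satisfies the bound, while the halved scale in the last line is refuted almost immediately; DPCheck automates this style of pointwise check for arbitrary programs without programmer annotations.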
We analyze the probability that DPCheck mistakenly accepts a non-private program and prove that this false-acceptance probability can be made exponentially small by a suitable choice of test size.
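One way to see why such a bound is plausible (a simplification of ours; the paper's actual analysis is more refined): if each independent test run exposes a violation in a non-private program with probability at least $p > 0$, then over $n$ runs
\[
\Pr[\text{false acceptance}] \le (1 - p)^n \le e^{-pn},
\]
which decays exponentially in the test size $n$.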
We demonstrate DPCheck's utility empirically by implementing all benchmark
algorithms from prior work on mechanical verification of differential privacy,
plus several others and their incorrect variants, and show that DPCheck accepts the
correct implementations and rejects the incorrect variants.
We also demonstrate how DPCheck can be deployed in a practical workflow to test differential privacy for the 2020 US Census Disclosure Avoidance System (DAS).