Differentially private analysis of graphs

Differentially private analysis of graphs[1] studies algorithms for computing accurate graph statistics while preserving differential privacy. Such algorithms are used for data represented in the form of a graph where nodes correspond to individuals and edges correspond to relationships between them. For example, edges could correspond to friendships, sexual relationships, or communication patterns. A party that has collected sensitive graph data can process it using a differentially private algorithm and publish the output of the algorithm. The goal of differentially private analysis of graphs is to design algorithms that compute accurate global information about graphs while preserving the privacy of the individuals whose data is stored in the graph.

Variants

Differential privacy imposes a restriction on the algorithm. Intuitively, it requires that the algorithm have roughly the same output distribution on any pair of neighboring inputs. When the input is a graph, there are two natural notions of neighboring inputs, edge neighbors and node neighbors, which yield two corresponding variants of differential privacy for graph data.

Let ε be a positive real number and $\mathcal{A}$ be a randomized algorithm that takes a graph as input and returns an output from a set $\mathcal{O}$. The algorithm $\mathcal{A}$ is ε-differentially private if, for all neighboring graphs $G_1$ and $G_2$ and all subsets $S$ of $\mathcal{O}$,

$$\Pr[\mathcal{A}(G_1) \in S] \le e^{\varepsilon} \cdot \Pr[\mathcal{A}(G_2) \in S],$$

where the probability is taken over the randomness used by the algorithm.
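
A standard way to satisfy this definition is the Laplace mechanism; the following is general differential-privacy background rather than a construction from the cited sources. For a real-valued graph statistic $f$, one computes its sensitivity with respect to the chosen notion of neighbors and adds noise proportional to it:

    $$\Delta f = \max_{G_1 \sim G_2} |f(G_1) - f(G_2)|, \qquad \mathcal{A}(G) = f(G) + \mathrm{Lap}\!\left(\Delta f / \varepsilon\right),$$

where $G_1 \sim G_2$ ranges over neighboring graphs and $\mathrm{Lap}(b)$ denotes Laplace noise with scale $b$. The sensitivity, and hence the amount of noise required, depends on whether edge neighbors or node neighbors are used, which is what makes the two variants below behave so differently.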

Edge differential privacy

Two graphs are edge neighbors if they differ in one edge. An algorithm is ε-edge-differentially private if, in the definition above, the notion of edge neighbors is used. Intuitively, an edge differentially private algorithm has similar output distributions on any pair of graphs that differ in a single edge, thus protecting the presence or absence of individual relationships (edges).
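
As a minimal illustration (not taken from the cited sources, and assuming the graph is given simply as a collection of edges), the number of edges changes by at most 1 between edge neighbors, so releasing it with Laplace noise of scale 1/ε is ε-edge-differentially private:

    import random

    def edge_private_edge_count(edges, epsilon):
        """Release the number of edges with epsilon-edge-differential privacy.

        Under edge neighbors (graphs differing in one edge), the edge count
        has sensitivity 1, so Laplace noise with scale 1/epsilon suffices.
        """
        true_count = len(edges)
        # The difference of two independent Exponential(rate=epsilon) variables
        # is Laplace-distributed with scale 1/epsilon.
        noise = random.expovariate(epsilon) - random.expovariate(epsilon)
        return true_count + noise

    # Example: release the edge count of a small graph with epsilon = 0.5.
    print(edge_private_edge_count({(1, 2), (2, 3), (1, 3)}, 0.5))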

Node differential privacy

Two graphs are node neighbors if one can be obtained from the other by deleting a node and its adjacent edges. An algorithm is ε-node-differentially private if, in the definition above, the notion of node neighbors is used. Intuitively, a node differentially private algorithm has similar output distributions on any pair of graphs that differ in one node and the edges adjacent to it, thus protecting all information pertaining to each individual. Node differential privacy gives a stronger privacy guarantee than edge differential privacy.
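
For contrast, here is a minimal sketch of a node differentially private edge count; it is an illustrative assumption, not an algorithm from the cited papers, and it assumes a public upper bound on the number of nodes. Deleting one node can remove up to that bound minus one edges, so the Laplace noise must scale with the size of the graph rather than with a constant:

    import random

    def node_private_edge_count(edges, epsilon, max_nodes):
        """Release the number of edges with epsilon-node-differential privacy,
        assuming every input graph has at most max_nodes nodes (a public bound).

        Deleting one node removes up to max_nodes - 1 edges, so the node
        sensitivity of the edge count is max_nodes - 1.
        """
        assert max_nodes >= 2
        sensitivity = max_nodes - 1
        true_count = len(edges)
        # Laplace noise with scale sensitivity / epsilon, built as the
        # difference of two independent exponentials with rate epsilon / sensitivity.
        rate = epsilon / sensitivity
        noise = random.expovariate(rate) - random.expovariate(rate)
        return true_count + noise

Noise that grows linearly with the bound on the number of nodes would overwhelm most statistics on realistic graphs, which is why the node differentially private algorithms cited below rely on more refined tools, such as the Lipschitz extensions of Raskhodnikova and Smith,[7] rather than this naive mechanism.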

Research history

The first edge differentially private algorithm was designed by Nissim, Raskhodnikova, and Smith.[2] The distinction between edge and node differential privacy was first discussed by Hay, Li, Miklau, and Jensen.[3] However, it took several years before the first node differentially private algorithms were published by Blocki et al.,[4] Kasiviswanathan et al.,[5] and Chen and Zhou.[6] In all three papers, the algorithms release a single statistic, such as a triangle count or the count of some other subgraph. Raskhodnikova and Smith gave the first node differentially private algorithm for releasing a vector, specifically the degree count and the degree distribution.[7]

References

  1. ^ Raskhodnikova, Sofya; Smith, Adam (2015). "Differentially Private Analysis of Graphs". In Kao, Ming-Yang (ed.). Encyclopedia of Algorithms. Springer, Berlin, Heidelberg. doi:10.1007/978-3-642-27848-8_549-1.
  2. ^ Nissim, Kobbi; Raskhodnikova, Sofya; Smith, Adam (2007). "Smooth sensitivity and sampling in private data analysis". Proceedings of the thirty-ninth annual ACM symposium on Theory of computing. New York, New York, USA: ACM Press. pp. 75–84. doi:10.1145/1250790.1250803. ISBN 9781595936318. S2CID 5642529.
  3. ^ Hay, Michael; Li, Chao; Miklau, Gerome; Jensen, David (2009). "Accurate Estimation of the Degree Distribution of Private Networks". 2009 Ninth IEEE International Conference on Data Mining. IEEE. pp. 169–178. doi:10.1109/icdm.2009.11. ISBN 9781424452422. S2CID 2572996.
  4. ^ Blocki, Jeremiah; Blum, Avrim; Datta, Anupam; Sheffet, Or (2012). "The Johnson-Lindenstrauss Transform Itself Preserves Differential Privacy". 2012 IEEE 53rd Annual Symposium on Foundations of Computer Science. pp. 410–419. arXiv:1204.2136. Bibcode:2012arXiv1204.2136B. doi:10.1109/focs.2012.67. ISBN 978-0-7695-4874-6. S2CID 349368.
  5. ^ Kasiviswanathan, Shiva Prasad; Nissim, Kobbi; Raskhodnikova, Sofya; Smith, Adam (2013), "Analyzing Graphs with Node Differential Privacy", Theory of Cryptography, Springer Berlin Heidelberg, pp. 457–476, doi:10.1007/978-3-642-36594-2_26, ISBN 9783642365935
  6. ^ Chen, Shixi; Zhou, Shuigeng (2013). "Recursive mechanism". Proceedings of the 2013 ACM SIGMOD International Conference on Management of Data. New York, New York, USA: ACM Press. pp. 653–664. doi:10.1145/2463676.2465304. ISBN 9781450320375. S2CID 16257197.
  7. ^ Raskhodnikova, Sofya; Smith, Adam (2016). "Lipschitz Extensions for Node-Private Graph Statistics and the Generalized Exponential Mechanism". 2016 IEEE 57th Annual Symposium on Foundations of Computer Science (FOCS). IEEE. pp. 495–504. doi:10.1109/focs.2016.60. ISBN 9781509039333. S2CID 7310416.