Theoretical Computer Science Spring School: Machine Learning / Ecole de Printemps d'Informatique Théorique : Apprentissage Automatique

Organizer(s): Cappé, Olivier; Garivier, Aurélien; Gribonval, Rémi; Kaufmann, Emilie; Vernade, Claire
Date(s): 23/05/2022 - 27/05/2022
Associated URL: https://conferences.cirm-math.fr/2542.html

Privacy in machine learning

By Rachel Cummings

Privacy concerns are becoming a major obstacle to using data in the way that we want. It's often unclear how current regulations should translate into technology, and the changing legal landscape surrounding privacy can cause valuable data to go unused. How can data scientists make use of potentially sensitive data, while providing rigorous privacy guarantees to the individuals who provided data? A growing literature on differential privacy has emerged in the last decade to address some of these concerns. Differential privacy is a parameterized notion of database privacy that gives a mathematically rigorous worst-case bound on the maximum amount of information that can be learned about any one individual's data from the output of a computation. Differential privacy ensures that if a single entry in the database were to be changed, then the algorithm would still have approximately the same distribution over outputs. In this talk, we will see the definition and properties of differential privacy; survey a theoretical toolbox of differentially private algorithms that come with a strong accuracy guarantee; and discuss recent applications of differential privacy in major technology companies and government organizations.
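The guarantee sketched above — that changing one individual's record leaves the output distribution almost unchanged — is typically achieved by adding calibrated noise. A minimal sketch of the Laplace mechanism of Dwork et al. (2006, listed in the bibliography below), using only the Python standard library; the database and counting query here are hypothetical illustrations, not part of the talk:

```python
import random


def laplace_mechanism(true_answer, sensitivity, epsilon):
    """Release true_answer plus Laplace(0, sensitivity/epsilon) noise.

    Noise of this scale makes the released value epsilon-differentially
    private: changing any single record shifts the output distribution
    by at most a multiplicative factor of exp(epsilon).
    """
    scale = sensitivity / epsilon
    # A Laplace(0, scale) sample is the difference of two independent
    # exponentials with mean `scale`.
    noise = random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)
    return true_answer + noise


# Hypothetical example: privately release a count. Changing one person's
# record moves a counting query by at most 1, so its sensitivity is 1.
database = [23, 45, 31, 62, 50, 19]  # illustrative ages
true_count = sum(1 for age in database if age >= 30)
private_count = laplace_mechanism(true_count, sensitivity=1.0, epsilon=0.5)
```

Smaller values of `epsilon` give stronger privacy but noisier answers; this accuracy/privacy trade-off is the parameterization the abstract refers to.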

Video information

Citation data

  • DOI 10.24350/CIRM.V.19921503
  • Cite this video: Cummings, Rachel (24/05/2022). Privacy in machine learning. CIRM. Audiovisual resource. DOI: 10.24350/CIRM.V.19921503
  • URL https://dx.doi.org/10.24350/CIRM.V.19921503

Bibliography

  • Dwork, C., McSherry, F., Nissim, K., Smith, A. (2006). Calibrating noise to sensitivity in private data analysis. In: Halevi, S., Rabin, T. (eds.) Theory of Cryptography (TCC 2006). Lecture Notes in Computer Science, vol. 3876. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11681878_14
  • Dwork, C., Roth, A. (2014). The algorithmic foundations of differential privacy. Foundations and Trends in Theoretical Computer Science, 9(3–4), 211–407. https://doi.org/10.1561/0400000042
  • McSherry, F., Talwar, K. (2007). Mechanism design via differential privacy. In: 48th Annual IEEE Symposium on Foundations of Computer Science (FOCS 2007), pp. 94–103. https://doi.org/10.1109/FOCS.2007.66

