Differential Privacy - A General Introduction
It is no secret that online companies, hospitals, credit-card companies and governments hold massive datasets composed of our personal and private details. Information from such datasets is often released using privacy-preserving heuristics, which have been repeatedly shown to fail. Moreover, such heuristics (i) view privacy as a binary notion (privacy is either violated or kept), (ii) protect against only one type of adversarial attack and (iii) fail to give explicit utility/privacy tradeoffs. In contrast, differentially private algorithms release information about a given dataset while satisfying a rigorous mathematical guarantee of privacy, and they avoid all three shortcomings.
In this talk we will define differential privacy and illustrate the variety of properties that follow immediately from its definition. We will also present a few basic algorithms that preserve differential privacy. Finally, we will discuss further implications of differential privacy in game theory and in preventing false discovery --- implications that should interest a broad audience, including those who do not care about protecting privacy.
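To give a flavor of the basic algorithms mentioned above, here is a minimal sketch of the Laplace mechanism, a standard textbook example of a differentially private algorithm (the abstract does not specify which algorithms the talk will cover, so this particular choice, along with all function names and parameters, is our own illustration): a numeric query is answered by adding Laplace-distributed noise whose scale is the query's sensitivity divided by the privacy parameter epsilon.

```python
import math
import random

def laplace_noise(scale):
    """Draw one sample from the Laplace(0, scale) distribution
    via inverse-CDF sampling using only the standard library."""
    u = random.random() - 0.5  # uniform on [-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def laplace_mechanism(true_value, sensitivity, epsilon):
    """Release true_value plus Laplace noise of scale sensitivity/epsilon.
    This satisfies epsilon-differential privacy for a query whose answer
    changes by at most `sensitivity` when one record is added or removed."""
    return true_value + laplace_noise(sensitivity / epsilon)

# Example: a counting query ("how many people are 40 or older?") has
# sensitivity 1, since adding or removing one person changes the count
# by at most 1.
ages = [34, 29, 51, 42, 60, 23]
true_count = sum(1 for a in ages if a >= 40)
private_count = laplace_mechanism(true_count, sensitivity=1.0, epsilon=0.5)
```

Note the explicit utility/privacy tradeoff: a smaller epsilon means stronger privacy but larger expected noise, so the released count is less accurate.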
This talk is self-contained and assumes no prior knowledge. Although it uses mathematical concepts, it is aimed at and intended for a broad audience. We therefore invite people from all disciplines to attend the talk and participate in its discussions.