Hi from Vancouver
Daniel Omer, a master’s student of Dr. Or Sheffet, attended the NeurIPS conference on AI and computational neuroscience in Vancouver, where he presented a paper on the link between differential privacy and hypothesis testing.
In December 2024, Daniel Omer visited Vancouver, Canada, where he attended NeurIPS (the Neural Information Processing Systems conference) – one of the biggest conferences in the fields of AI and computational neuroscience. Daniel presented a poster covering his joint research with Dr. Or Sheffet, studying the link between two realms: differential privacy and hypothesis testing. “Differential privacy is a mathematical way of ‘guaranteeing’ that even if I’m part of a database, no one will be able to determine my presence with certainty (or with near certainty), even if they use statistical ‘magic’. This is achieved by adding incremental ‘noise’ to the algorithm’s result (i.e., the answer to the database query). The noise obscures my identity while still providing a usable, though slightly inaccurate, result. In other words, the information received from the database will not change significantly if I decide to join or leave it. This is mathematics’ way of saying, ‘your data is safe, because even if you leave a fingerprint, it will be almost completely indistinct,’” he explains. “Hypothesis testing is one of the cornerstones of statistics. Its goal is to decide whether a certain data pattern is the result of randomness or a reflection of a real phenomenon. In today’s world of differential privacy, we’re required to rethink traditional statistical tools.”
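As a rough illustration of the idea Daniel describes (and not code from the paper itself), here is a minimal sketch of the classic Laplace mechanism: calibrated noise is added to the answer of a counting query so that any single person’s presence or absence is masked. The function name, the epsilon value, and the toy data below are illustrative assumptions only.

```python
import numpy as np

def laplace_count(data, predicate, epsilon=1.0, rng=None):
    """Return a differentially private count of records satisfying `predicate`.

    A counting query changes by at most 1 when a single person joins or
    leaves the database (sensitivity 1), so Laplace noise with scale
    1/epsilon is enough to hide any individual's contribution.
    """
    rng = np.random.default_rng() if rng is None else rng
    true_count = sum(1 for record in data if predicate(record))
    noise = rng.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Toy usage: how many people in this (made-up) database are older than 40?
ages = [23, 45, 31, 67, 52, 29, 41]
print(laplace_count(ages, lambda age: age > 40, epsilon=0.5))
```

Smaller values of epsilon mean more noise and stronger privacy; larger values give more accurate answers at the cost of weaker guarantees.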
Daniel says that while classic statistical methods often rely on asymptotic assumptions – that is, analyses that approach accuracy as the number of samples grows to infinity – differential privacy is more constraining: we must know precisely how many samples we have and how different they may be from one another. “Research in this field has primarily focused on finite domains, that is, distributions over a fixed collection of specific values from which the distribution can choose. We’re the first to take this question a step further and explore a continuous domain, a range in which every value is possible, including numbers with infinitely many decimal digits, under minimal assumptions about that distribution. The solution required creativity and precise analysis to balance the privacy requirements against the ability to produce significant statistical insights,” he shares. A toy sketch of what a privatized test on continuous data can look like appears below.
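The following is a toy, hedged sketch of the general flavour of private testing on continuous data, not the algorithm from the paper: values are clipped to a known range so the sample mean has bounded sensitivity, and Laplace noise calibrated to the exact, finite sample size is added before the decision is made. All names, thresholds, and parameters here are illustrative assumptions.

```python
import numpy as np

def private_mean_test(samples, threshold, epsilon=1.0, clip=(0.0, 1.0), rng=None):
    """Toy differentially private test: is the mean above `threshold`?

    Values are clipped to a known range so that one person can change the
    sample mean by at most (hi - lo) / n; Laplace noise with that scale
    divided by epsilon then hides any single contribution. Note that the
    noise scale depends on the exact, finite sample size n, not on an
    asymptotic approximation.
    """
    rng = np.random.default_rng() if rng is None else rng
    lo, hi = clip
    x = np.clip(np.asarray(samples, dtype=float), lo, hi)
    n = len(x)
    sensitivity = (hi - lo) / n
    noisy_mean = x.mean() + rng.laplace(scale=sensitivity / epsilon)
    return noisy_mean > threshold

# Toy usage on a continuous-valued sample.
rng = np.random.default_rng(0)
sample = rng.uniform(0.0, 1.0, size=500)
print(private_mean_test(sample, threshold=0.45, epsilon=0.5, rng=rng))
```

The point of the sketch is only to show why the exact number of samples and a bound on how different they can be are needed before any noise can be calibrated; the paper’s actual equivalence-testing procedure is considerably more involved.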
Daniel, 26, from Or Yehuda, completed his bachelor’s in mathematics while still in high school. He is now completing his master’s in mathematics under the supervision of Dr. Or Sheffet of the computer engineering program. “I was looking for a supervisor for my thesis, which I knew for certain would involve privacy. At the time, I was mostly familiar with topics related to cryptography. Then I heard about an engineering course on differential privacy, signed up, and that’s when I first met Or and the fascinating world of differential privacy. I realized I wanted to dive deeper, and Or opened the door for me. Our research and the questions we ask are mostly theoretical, and with theoretical work – especially at the beginning – it’s easy to get lost without proper guidance. Or makes sure we meet on a weekly basis, even when we haven’t made any significant progress, and that’s when we hone our ideas and raise new questions. It’s extremely helpful. Oftentimes, the real progress happens during those sessions.”
At the conference, Daniel presented the paper he wrote with Dr. Sheffet, Differentially Private Equivalence Testing for Continuous Distributions and Applications. “The presentation itself took three consecutive hours, but the time went by quickly, and I enjoyed seeing people come up, show interest and ask good questions, especially when they were researchers whose papers I had read and I was finally able to put a face to the name. That was the moment the academic world became approachable and tangible.”
Last Updated Date: 30/01/2025