Dr. Franziska Boenisch (CISPA) will give a talk on »Privacy-Preserving Machine Learning in the Era of Large Foundation Models«
News from Jun 20, 2025
On July 1, 2025, at 14:15, Dr. Franziska Boenisch (CISPA) will give a talk on »Privacy-Preserving Machine Learning in the Era of Large Foundation Models«
When: July 1, 2025, 14:15
Where: SR049

Dr. Franziska Boenisch is a tenure-track faculty member at the CISPA Helmholtz Center for Information Security, where she co-leads the SprintML lab. Before that, she was a Postdoctoral Fellow at the University of Toronto and the Vector Institute, advised by Prof. Nicolas Papernot. Her current research centers on private and trustworthy machine learning. Franziska obtained her Ph.D. from the Computer Science Department at Freie Universität Berlin, where she pioneered the notion of individualized privacy in machine learning. During her Ph.D., Franziska was a research associate at the Fraunhofer Institute for Applied and Integrated Security (AISEC), Germany. She received a Fraunhofer TALENTA grant for outstanding female early-career researchers, the German Industrial Research Foundation prize for her research on machine learning privacy,
the Fraunhofer ICT Dissertation Award 2023, the Academics Nachwuchspreis 2025, and was named a GI-Junior Fellow in 2024.
Abstract
As machine learning permeates nearly all aspects of society, protecting the privacy of individuals whose data power these models is more crucial than ever. In this talk, I will provide a comprehensive overview of differential privacy (DP) — the leading framework for mathematically rigorous privacy guarantees — and discuss algorithms for implementing it in practice for machine learning. We will explore the benefits and drawbacks of applying DP in machine learning, shedding light on the utility-privacy trade-offs and the technical, operational, and societal challenges associated with it. Finally, we will consider the future of DP in the era of large foundation models, addressing questions about scaling, adaptability, and the unique risks and opportunities these powerful models present for protecting individual privacy.
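To make the idea of "algorithms for implementing DP in practice for machine learning" concrete, here is a minimal sketch of one step of differentially private gradient descent in the style of DP-SGD (not code from the talk): each example's gradient is clipped to bound its individual influence, and calibrated Gaussian noise is added to the average. The function name and parameter values are illustrative assumptions.

```python
import numpy as np

def dp_sgd_step(params, per_example_grads, clip_norm=1.0,
                noise_multiplier=1.1, lr=0.1, rng=None):
    """One differentially private SGD step: clip each example's gradient
    to bound its influence, average, then add calibrated Gaussian noise."""
    if rng is None:
        rng = np.random.default_rng()
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        # Scale down any gradient whose L2 norm exceeds clip_norm.
        clipped.append(g * min(1.0, clip_norm / (norm + 1e-12)))
    mean_grad = np.mean(clipped, axis=0)
    # Noise standard deviation is proportional to the clipping bound,
    # so the privacy guarantee holds regardless of the raw gradients.
    noise = rng.normal(0.0, noise_multiplier * clip_norm / len(clipped),
                       size=mean_grad.shape)
    return params - lr * (mean_grad + noise)
```

The clipping bound caps how much any single training example can move the model, which is exactly the per-individual influence that the privacy-utility trade-off discussed in the abstract controls: tighter clipping and more noise mean stronger privacy but a noisier training signal.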