Locally private learning, estimation, inference and optimality
Thursday, June 20, 2019 - 3:30pm - 4:20pm
In this talk, we investigate statistical learning under local privacy models, in which data must be privatized before collection. We study the fundamental tradeoff between statistical utility and privacy by developing the local minimax risk, which yields sharp instance-specific bounds for private estimation. In contrast to the previous approach based on worst-case (global minimax) risk, this new approach lets us evaluate the difficulty of each individual problem instance and delineate the possibilities for adaptation in private estimation and inference. In the first part of our results, we identify new information-theoretic lower bounds for private estimation by developing an analogue of Fisher information, which gives a more nuanced understanding of the challenges of adaptivity and optimality under local privacy models. In the second part, we provide new optimal procedures that adaptively achieve these information-theoretic lower bounds, highlighting the importance of a more careful analysis of the optimal tradeoffs between statistical utility and privacy. One consequence of our results is to identify the settings in which local privacy models are viable, and those in which they are too stringent, for statistical practice.
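To make the local-privacy setting concrete, the sketch below implements randomized response, the classic mechanism for privatizing a single bit before it leaves the data holder, together with the debiased mean estimate an analyst can recover from the noisy reports. This is an illustrative example of a local privacy mechanism, not the specific procedures from the talk; the function names and the choice of epsilon = 1.0 are ours.

```python
import math
import random

def randomized_response(bit, epsilon):
    """Privatize one bit under epsilon-local differential privacy:
    report the true bit with probability e^eps / (1 + e^eps),
    otherwise report the flipped bit (classic randomized response)."""
    p_truth = math.exp(epsilon) / (1.0 + math.exp(epsilon))
    return bit if random.random() < p_truth else 1 - bit

def debiased_mean(reports, epsilon):
    """Unbiased estimate of the true mean of the bits,
    correcting for the known flip probability of the mechanism."""
    p = math.exp(epsilon) / (1.0 + math.exp(epsilon))
    return (sum(reports) / len(reports) - (1 - p)) / (2 * p - 1)

random.seed(0)
true_bits = [1] * 700 + [0] * 300          # true mean is 0.7
reports = [randomized_response(b, epsilon=1.0) for b in true_bits]
est = debiased_mean(reports, epsilon=1.0)   # close to 0.7, with extra variance
```

The debiasing step illustrates the utility cost of local privacy: the variance of the estimate is inflated by a factor depending on epsilon, which is exactly the kind of utility-versus-privacy tradeoff the instance-specific bounds in the talk quantify.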