# Cookbook: Lower Bounds for Statistical Inference in Distributed and Constrained Settings

## FOCS 2020 Tutorial: Friday, November 13 (Virtually Everywhere)

The goal of this tutorial is to provide attendees with an overview of techniques and recipes for distributed learning and testing under constraints. In recent years, many papers have obtained both upper and lower bounds for statistical estimation under communication, local privacy, and memory constraints (see, e.g., the bibliography): these questions are motivated by applications in machine learning and distributed computing, and lie at the intersection of theoretical computer science, machine learning, statistics, and information theory.

This tutorial aims to provide a primer on these techniques, distilling them into "plug-and-play" general recipes that attendees can then apply to problems of their choice. Our focus will be on establishing lower bounds for statistical inference, in particular for parameter estimation (single- and high-dimensional) and testing.
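To fix notation, here is one common way to formalize these constrained settings, as a minimax problem in which each sample must pass through a restricted channel before reaching the server (a schematic sketch in our own notation, which may differ from the tutorial's):

```latex
% Schematic setup (our notation). Each of n users observes X_i ~ p,
% i.i.d. from an unknown p in a family \mathcal{P}, and sends a message
% Y_i = W_i(X_i), where each channel W_i must lie in a constrained
% class \mathcal{W} (e.g., \ell-bit outputs, or \varepsilon-LDP channels).
% The server sees only Y^n = (Y_1, \dots, Y_n); the object of study is
% the constrained minimax risk
\[
  R(n, \mathcal{W})
  \;=\; \inf_{W_1, \dots, W_n \in \mathcal{W}} \;
        \inf_{\hat{p}} \; \sup_{p \in \mathcal{P}} \;
        \mathbb{E}\big[\, L\big(\hat{p}(Y^n), p\big) \,\big],
\]
% for a loss L of interest (e.g., \ell_2^2 or total variation). Lower
% bounds on R(n, \mathcal{W}) are exactly what Parts II and III target.
```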

Besides the four recorded parts of the tutorial below, it will also feature a live session during the FOCS conference for discussion and Q&A. This will take place on **November 13th, 3-4pm ET**.

- **Part I:** Clément Canonne
- **Part II:** Jayadev Acharya
- **Part III:** Himanshu Tyagi
- **Part IV:** Clément Canonne

**Part I** (Clément Canonne). This first lecture introduces the types of questions considered, provides examples of what we mean by "constraints" and how they are modeled, and defines the various settings (distributed, interactive or not) discussed in the other parts of the tutorial.
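For concreteness, here is a minimal sketch (in our notation) of two constraint families that recur throughout the tutorial:

```latex
% Two canonical local constraints on a channel W taking input x to a
% message Y (a sketch; the lecture discusses how such models are set up).
% (1) Communication: each message is at most \ell bits long, i.e.,
\[
  Y \in \{0,1\}^{\ell}.
\]
% (2) Local differential privacy: W is \varepsilon-LDP if for all input
% pairs x, x' and all sets S of messages,
\[
  \Pr[\, W(x) \in S \,] \;\le\; e^{\varepsilon} \, \Pr[\, W(x') \in S \,].
\]
% Both constraints are local: each user's channel is restricted on its
% own, before any messages are pooled at the server.
```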

[video (YouTube)], [slides (PPTX)], [slides (PDF)]

**Part II** (Jayadev Acharya). This part of the tutorial covers three methods to establish lower bounds for information-constrained learning and estimation of distributions under general $\ell_p$ loss functions. The methods are based on the classic Cramér–Rao method, strong data processing inequalities, and chi-squared contractions.
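To give a taste of the general recipe (a schematic sketch of the information-theoretic route; the precise statements, and the Cramér–Rao variant, are in the slides): one reduces estimation to identifying a uniformly random parameter, then bounds how much information the constrained messages can carry about it.

```latex
% "Reduce, then contract": schematic estimation lower bound (our notation).
% Step 1 (reduction). Choose a well-separated family {p_v : v \in \mathcal{V}}
% and let V ~ Unif(\mathcal{V}); any accurate estimator must identify V,
% so by Fano's inequality,
\[
  \Pr[\text{error}] \;\ge\; 1 - \frac{I(V \wedge Y^n) + \log 2}{\log |\mathcal{V}|}.
\]
% Step 2 (contraction). For non-interactive protocols, Y_1, ..., Y_n are
% independent given V, hence
\[
  I(V \wedge Y^n) \;\le\; \sum_{i=1}^{n} I(V \wedge Y_i),
\]
% and each term is trivially at most \ell bits under an \ell-bit
% communication constraint. Strong data-processing and chi-squared
% contraction arguments improve this per-user bound when each X_i is
% itself only weakly informative about V, which is what yields the
% sharp dependence on the constraint.
```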

[video (YouTube)], [slides (PPTX)], [slides (PDF)]

**Part III** (Himanshu Tyagi). In this part, we consider hypothesis testing under information constraints and present two general lower bounds: the decoupled chi-square contraction bound and the average information bound. We conclude with a discussion of the high-dimensional mean testing problem, for which only partial results are known.
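To convey the flavor of such testing bounds (again a schematic sketch in our notation, not the decoupled bound itself): one compares the distribution of the messages under the null against their distribution averaged over a random alternative.

```latex
% Schematic mixture argument for testing lower bounds (our notation).
% To rule out protocols distinguishing p_0 from alternatives {p_z},
% draw Z uniformly and compare message distributions: writing Q_z^n for
% the law of Y^n when the data comes from p_z,
\[
  d_{\mathrm{TV}}\big( Q_0^n, \; \mathbb{E}_Z[\, Q_Z^n \,] \big)
  \;\le\; \tfrac{1}{2} \sqrt{ \chi^2\big( \mathbb{E}_Z[\, Q_Z^n \,] \;\big\|\; Q_0^n \big) } .
\]
% If the right-hand side is o(1), no test can tell the null and the
% mixed alternative apart with constant advantage. Chi-square
% contraction bounds quantify how much the constrained channels shrink
% this chi-square distance, and hence how many users are needed.
```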

[video (YouTube)], [slides (PPTX)], [slides (PDF)]

**Part IV** (Clément Canonne). The last part of the tutorial concludes the series by looking at *upper bounds*, that is, algorithms and protocols for the estimation and testing tasks discussed in the previous lectures whose performance matches the lower bounds established there. Finally, a couple of open questions are mentioned.
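As one concrete illustration of an upper bound (a minimal sketch, not necessarily one of the protocols covered in the lecture): the classic randomized-response mechanism is an $\varepsilon$-LDP protocol for estimating the mean of binary data, simulated below in Python with hypothetical parameter choices.

```python
# Minimal sketch: binary randomized response, a classic eps-LDP protocol.
# Illustrative only; parameter choices below are hypothetical.
import math
import random


def randomize(bit: int, eps: float) -> int:
    """Each user keeps their bit w.p. e^eps / (e^eps + 1), else flips it.

    The two conditional message distributions differ by a factor of at
    most e^eps, which is exactly the eps-LDP constraint.
    """
    keep_prob = math.exp(eps) / (math.exp(eps) + 1.0)
    return bit if random.random() < keep_prob else 1 - bit


def estimate_mean(messages: list, eps: float) -> float:
    """Debias the empirical average of the privatized bits.

    E[message] = (2q - 1) * mean + (1 - q) with q = keep probability,
    so inverting this affine map gives an unbiased estimate.
    """
    q = math.exp(eps) / (math.exp(eps) + 1.0)
    avg = sum(messages) / len(messages)
    return (avg - (1.0 - q)) / (2.0 * q - 1.0)


if __name__ == "__main__":
    random.seed(0)
    n, eps, true_mean = 100_000, 1.0, 0.3
    data = [1 if random.random() < true_mean else 0 for _ in range(n)]
    messages = [randomize(x, eps) for x in data]
    print(f"true mean: {true_mean}, estimate: {estimate_mean(messages, eps):.4f}")
```

For small $\varepsilon$, the variance of this estimator scales as $1/(n\varepsilon^2)$, the same $\varepsilon^2$ penalty that appears in the local-privacy lower bounds.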

The annotated bibliography for this tutorial can be found here.

- Jayadev Acharya is an assistant professor in the School of Electrical and Computer Engineering at Cornell University. He received the Bachelor of Technology degree in Electronics and Electrical Communication Engineering from the Indian Institute of Technology, Kharagpur in 2007, and the M.S. (2009) and Ph.D. (2014) degrees in Electrical and Computer Engineering from the University of California, San Diego. He was a postdoctoral associate in Electrical Engineering and Computer Science at the Massachusetts Institute of Technology from 2014 to 2016.
- Clément Canonne is an incoming Lecturer at the School of Computer Science of the University of Sydney, Australia. Prior to this, he was a Goldstine Postdoctoral Fellow at IBM Research, and a Motwani Postdoctoral Fellow at Stanford University, after graduating from Columbia University in 2017, where he was advised by Rocco Servedio. His research focuses on the fields of property testing and sublinear algorithms, and more broadly on computational aspects of learning and statistical inference.
- Himanshu Tyagi received the B.Tech. degree in electrical engineering and the M.Tech. degree in communication and information technology, both from the Indian Institute of Technology, Delhi, India in 2007. He received the Ph.D. degree from the University of Maryland, College Park in 2013. From 2013 to 2014, he was a postdoctoral researcher at the Information Theory and Applications (ITA) Center, University of California, San Diego. Since January 2015, he has been a faculty member at the Department of Electrical Communication Engineering, Indian Institute of Science in Bangalore. His research interests broadly lie in information theory and its applications in cryptography, statistics, machine learning, and computer science. He is also interested in communication and automation for city-scale systems.