$$\frac{2}{\pi}\int_{-1}^{1} U_n(x)\, U_m(x) \sqrt{1-x^2}\, dx = \delta_{n,m}$$

$$T_{n+1}(x) = 2x\, T_n(x) - T_{n-1}(x)$$
# (Some) Orthogonal Polynomials and their Applications to TCS

## FOCS 2016 Workshop: Saturday, October 8 (New Brunswick)

- **2:30-2:40** Introductory remarks
- **2:40-3:30** Nisheeth Vishnoi
- **3:40-4:30** Justin Thaler
- *Coffee break*
- **5:10-6:00** Paul Valiant

Orthogonal polynomials play a central role in approximation theory, which in turn has played an important role in the development of fast algorithms. Low-degree approximations to fundamental real-valued functions allow us to speed up the computation of corresponding matrix-valued functions. The ability to compute such functions quickly underlies various modern spectral algorithms. In this talk, I will present this interplay between polynomials and algorithms.

Part of the talk will be based on a monograph with Sushant Sachdeva. [slides] [video]
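As a concrete illustration of the interplay described above, here is a minimal NumPy sketch (not from the talk or the monograph; all names and parameter choices are illustrative) that approximates the matrix function $e^{-A}$ for a symmetric PSD matrix $A$: build a low-degree Chebyshev interpolant of the scalar function $e^{-x}$ on the spectral interval, then apply it to $A$ via the three-term recurrence.

```python
import numpy as np
from numpy.polynomial import chebyshev as C

rng = np.random.default_rng(0)
B = rng.standard_normal((50, 50))
A = (B @ B.T) / 50                      # symmetric PSD matrix
lmax = np.linalg.eigvalsh(A).max()

# Degree-d Chebyshev interpolant of exp(-x) on the spectral interval [0, lmax].
d = 12
p = C.Chebyshev.interpolate(lambda x: np.exp(-x), d, domain=[0, lmax])

# Apply the polynomial to A via the recurrence T_{k+1}(M) = 2 M T_k(M) - T_{k-1}(M),
# where M = (2A - lmax*I)/lmax maps the spectrum into [-1, 1].
I = np.eye(50)
M = (2 * A - lmax * I) / lmax
c = p.coef
T_prev, T_cur = I, M
F = c[0] * T_prev + c[1] * T_cur
for k in range(2, d + 1):
    T_prev, T_cur = T_cur, 2 * M @ T_cur - T_prev
    F += c[k] * T_cur

# Compare against the exact matrix exponential via eigendecomposition.
w, V = np.linalg.eigh(A)
exact = (V * np.exp(-w)) @ V.T
err = np.linalg.norm(F - exact)
```

In spectral algorithms one typically runs the same recurrence on a vector rather than on dense matrices, so a degree-$d$ approximation costs only $d$ matrix-vector products.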

The $\varepsilon$-approximate degree of a Boolean function $f$ is the minimum degree of a real polynomial that pointwise approximates $f$ to error $\varepsilon$. Approximate degree has wide-ranging applications in theoretical computer science, from computational learning theory to communication complexity, circuit complexity, oracle separations, and even cryptography.
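For symmetric functions such as OR, a classical symmetrization argument (due to Minsky and Papert) reduces approximate degree to a univariate problem over Hamming weights, which can be computed exactly by a small linear program. A hedged sketch (not from the talk; it assumes SciPy is available, and the helper name `best_err` is mine):

```python
import numpy as np
from scipy.optimize import linprog

def best_err(f_vals, d):
    """Least max-deviation of a degree-d univariate poly from f on {0, ..., n}."""
    n = len(f_vals) - 1
    V = np.vander(np.arange(n + 1), d + 1, increasing=True)  # V[t, j] = t**j
    # Variables (a_0, ..., a_d, s); minimize s subject to |V a - f| <= s.
    A_ub = np.block([[V, -np.ones((n + 1, 1))],
                     [-V, -np.ones((n + 1, 1))]])
    b_ub = np.concatenate([f_vals, -f_vals])
    c = np.zeros(d + 2)
    c[-1] = 1.0
    bounds = [(None, None)] * (d + 1) + [(0, None)]
    return linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds).fun

n = 6
or_vals = np.array([0.0] + [1.0] * n)   # OR_n as a function of Hamming weight
errs = [best_err(or_vals, d) for d in range(n + 1)]
adeg = next(d for d in range(n + 1) if errs[d] <= 1/3)  # 1/3-approximate degree
```

The best achievable error decreases with the degree, and the $1/3$-approximate degree is the first degree at which it drops to $1/3$ or below.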

This talk will survey what is known about approximate degree and its many applications. I will start by describing optimal approximations for several fundamental classes of functions; these approximations underlie many state-of-the-art algorithms, and are often based on Chebyshev polynomials. I will then describe recent progress towards showing that these approximations are essentially optimal; these lower bounds have enabled striking progress on longstanding open problems, especially in communication complexity.
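The Chebyshev-based approximations mentioned above can be made concrete for OR: a classical construction in the spirit of Nisan and Szegedy achieves error $\varepsilon$ with degree $O(\sqrt{n}\log(1/\varepsilon))$, using the fact that $T_d$ is bounded on $[-1,1]$ but grows exponentially in $d/\sqrt{n}$ just outside it. A minimal NumPy sketch, with all parameter choices mine:

```python
import numpy as np
from numpy.polynomial import chebyshev as C

n, eps = 100, 0.1
d = int(np.ceil(np.sqrt(n) * np.log(2 / eps)))   # degree O(sqrt(n) log(1/eps))

# Map the "accept" weights {1, ..., n} onto [-1, 1]; weight t = 0 lands
# outside the interval, where T_d is exponentially large.
def ell(t):
    return (2 * t - n - 1) / (n - 1)

T_d = C.Chebyshev.basis(d)
q = lambda t: T_d(ell(t)) / T_d(ell(0))   # q(0) = 1, |q(t)| small for t >= 1
p = lambda t: 1 - q(t)                    # approximates OR as a function of |x|

assert abs(p(0)) <= eps
assert max(abs(p(t) - 1) for t in range(1, n + 1)) <= eps
```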

[slides] [video]

Using examples from my work on statistical property estimation, we will explore the three main families of orthogonal polynomials: Chebyshev, Laguerre, and Hermite. These three classes of polynomials mimic aspects of rather different types of functions: Chebyshev polynomials mimic sines and cosines; Laguerre polynomials add aspects of exponential functions; and Hermite polynomials inherit properties from Gaussians, including being well-behaved under Fourier transforms. We will explore how these perspectives provide a general toolkit for flexible polynomial constructions, showing both algorithmic upper and lower bounds derived from these techniques.
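The defining properties behind these three correspondences can be checked numerically; a small NumPy sketch (my own illustration, not from the talk) verifies the trigonometric identity $T_n(\cos\theta)=\cos(n\theta)$ and the orthogonality of Laguerre polynomials under the weight $e^{-x}$ and Hermite polynomials under the Gaussian weight $e^{-x^2}$:

```python
import numpy as np
from numpy.polynomial import chebyshev as C, laguerre as L, hermite as H

# Chebyshev mimics cosines: T_5(cos(theta)) = cos(5*theta).
theta = np.linspace(0, np.pi, 50)
cheb_err = np.max(np.abs(C.Chebyshev.basis(5)(np.cos(theta)) - np.cos(5 * theta)))

# Laguerre pairs with exp(-x) on [0, inf): <L_2, L_3> = 0 by Gauss-Laguerre quadrature.
x, w = L.laggauss(20)
lag_ip = np.sum(w * L.Laguerre.basis(2)(x) * L.Laguerre.basis(3)(x))

# Hermite pairs with the Gaussian exp(-x^2): <H_2, H_3> = 0 by Gauss-Hermite quadrature.
x, w = H.hermgauss(20)
herm_ip = np.sum(w * H.Hermite.basis(2)(x) * H.Hermite.basis(3)(x))
```

The Gauss quadrature rules are exact for polynomial integrands of degree below twice the node count, so both inner products vanish up to floating-point error.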

[slides] [video]

- Justin Thaler received his PhD at Harvard University under the supervision of Michael Mitzenmacher, and is currently an Assistant Professor at Georgetown University. His research focuses on algorithms for massive data sets, verifiable computation, and computational learning theory.
- Paul Valiant received his PhD at MIT under the supervision of Silvio Micali, and is currently an Assistant Professor at Brown University. His research focuses on the theory behind big data, scientific computing, and evolution.
- Nisheeth Vishnoi received his PhD at Georgia Tech under the supervision of Richard Lipton, and is currently a Professor at EPFL. His research focuses on algorithms, complexity, and optimization, including on how computation can be used to gain insight into processes in nature and society.