A. Klivans, R. O'Donnell, and R. Servedio.

To appear in

We study the learnability of sets in $\R^n$ under the Gaussian distribution, taking Gaussian \emph{surface area} as the ``complexity measure'' of the sets being learned. Let $\calC_S$ denote the class of all (measurable) sets with surface area at most $S$. We first show that the class $\calC_S$ is learnable to any constant accuracy in time $n^{O(S^2)}$, even in the arbitrary noise (``agnostic'') model. Complementing this, we also show that any learning algorithm for $\calC_S$ information-theoretically requires $2^{\Omega(S^2)}$ examples for learning to constant accuracy. These results together show that Gaussian surface area essentially characterizes the computational complexity of learning under the Gaussian distribution.
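For concreteness, here is the standard definition of Gaussian surface area (it is the usual one from Gaussian geometry, not spelled out in this abstract): for a measurable set $A \subseteq \R^n$,
\[
\Gamma(A) \;=\; \liminf_{\delta \to 0^+} \frac{\Pr_{x \sim \mathcal{N}(0, I_n)}\bigl[x \in A_\delta \setminus A\bigr]}{\delta},
\]
where $A_\delta$ is the $\delta$-neighborhood of $A$. For sufficiently regular sets this equals the surface integral $\int_{\partial A} \varphi_n(x)\, d\sigma(x)$ of the standard Gaussian density $\varphi_n(x) = (2\pi)^{-n/2} e^{-\|x\|^2/2}$ over the boundary of $A$. As a sanity check, a halfspace $\{x : v \cdot x \le \theta\}$ with $\|v\| = 1$ has Gaussian surface area $\varphi(\theta) \le 1/\sqrt{2\pi}$, a constant independent of $n$.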

Our approach yields several new learning results, including the following (all bounds are for learning to any constant accuracy):

- The class of {\em all} convex sets can be agnostically learned in time $2^{\tilde{O}(\sqrt{n})}$ (and we prove a $2^{\Omega(\sqrt{n})}$ lower bound for noise-free learning). This is the first subexponential time algorithm for learning general convex sets even in the noise-free (PAC) model.
- Intersections of $k$ halfspaces can be agnostically learned in time $n^{O(\log k)}$ (cf. Vempala's $n^{O(k)}$ time algorithm for learning in the noise-free model~\cite{Vempala:04book}).
- Arbitrary cones (with apex centered at the origin), and spheres with arbitrary radius and center, can be agnostically learned in time $\poly(n)$.
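These runtimes are consistent with plugging known surface-area estimates into the $n^{O(S^2)}$ bound above (the surface-area facts quoted here are classical results, not restated in this abstract): every convex set in $\R^n$ has Gaussian surface area $O(n^{1/4})$ (Ball), and an intersection of $k$ halfspaces has Gaussian surface area $O(\sqrt{\log k})$ (Nazarov). Accordingly,
\[
S = O(n^{1/4}) \;\Longrightarrow\; n^{O(S^2)} = n^{O(\sqrt{n})} = 2^{\tilde{O}(\sqrt{n})},
\qquad
S = O(\sqrt{\log k}) \;\Longrightarrow\; n^{O(S^2)} = n^{O(\log k)}.
\]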

Postscript or PDF of extended version.