Boosting and Hard-Core Sets.
A. Klivans and R. Servedio.
Machine Learning 53(3), 2003, pp. 217-238 (special issue on Computational Learning Theory).
Preliminary version in 40th Annual Symposium on Foundations of Computer Science (FOCS), 1999, pp. 624-633.


This paper connects {\em hard-core set construction}, a type of hardness amplification from computational complexity, and {\em boosting}, a technique from computational learning theory. Using this connection we give fruitful applications of complexity-theoretic techniques to learning theory and vice versa. We show that the hard-core set construction of Impagliazzo \cite{Impagliazzo:95}, which establishes the existence of distributions under which Boolean functions are highly inapproximable, may be viewed as a boosting algorithm. Using alternate boosting methods we give an improved bound for hard-core set construction which matches known lower bounds from boosting and thus is optimal within this class of techniques. We then show how to apply techniques from \cite{Impagliazzo:95} to give a new version of Jackson's celebrated Harmonic Sieve algorithm for learning DNF formulae under the uniform distribution using membership queries. Our new version has a significant asymptotic improvement in running time. Critical to our arguments is a careful analysis of the distributions which are employed in both boosting and hard-core set constructions.
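To make the connection concrete, the following is a minimal sketch of a generic distribution-reweighting boosting loop (AdaBoost-style; not the specific boosters or the hard-core construction analyzed in the paper). Each round it selects the weak hypothesis with the smallest error under the current distribution, then shifts probability mass onto the misclassified examples; it is this evolving sequence of distributions, concentrating on "hard" inputs, that mirrors the distributions arising in hard-core set constructions. All names here are illustrative.

```python
import math

def boost(X, y, weak_hyps, rounds):
    """Distribution-reweighting boosting loop (illustrative sketch).
    Labels and weak hypotheses take values in {-1, +1}."""
    n = len(X)
    D = [1.0 / n] * n                       # start from the uniform distribution
    ensemble = []                           # (weight, hypothesis) pairs
    for _ in range(rounds):
        # weighted error of each candidate weak hypothesis under D
        errs = [sum(d for d, x, t in zip(D, X, y) if h(x) != t)
                for h in weak_hyps]
        i = min(range(len(weak_hyps)), key=errs.__getitem__)
        eps, h = errs[i], weak_hyps[i]
        if eps >= 0.5:                      # no weak hypothesis has an edge left
            break
        if eps == 0:                        # h is already perfect under D
            ensemble.append((1.0, h))
            break
        alpha = 0.5 * math.log((1 - eps) / eps)
        ensemble.append((alpha, h))
        # Shift probability mass onto the examples h gets wrong: these
        # "hard" examples are what a hard-core set isolates.
        D = [d * math.exp(-alpha * t * h(x)) for d, x, t in zip(D, X, y)]
        Z = sum(D)
        D = [d / Z for d in D]              # renormalize to a distribution
    # final hypothesis: weighted majority vote of the weak hypotheses
    return lambda x: 1 if sum(a * h(x) for a, h in ensemble) >= 0 else -1
```

For example, on the labeled sample `X = [0, 1, 2]`, `y = [1, -1, 1]`, no single threshold stump is correct everywhere, but three rounds of the loop above combine three weak stumps into a perfect weighted vote.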

