E. Mossel, R. O'Donnell, and R. Servedio. *Journal of Computer and System Sciences* 69(3), 2004, pp. 421-434 (special issue for STOC 2003).

Preliminary version "Learning Juntas" in *35th Annual ACM Symposium on Theory of Computing (STOC)*, 2003, pp. 206-212.

We consider a fundamental problem in computational learning theory: learning an arbitrary Boolean function which depends on an unknown set of $k$ out of $n$ Boolean variables. We give an algorithm for learning such functions from uniform random examples which runs in time roughly $(n^k)^{\frac{\omega}{\omega + 1}}$, where $\omega < 2.376$ is the matrix multiplication exponent. We thus obtain the first polynomial factor improvement on the naive $n^k$ time bound which can be achieved via exhaustive search. Our algorithm and analysis exploit new structural properties of Boolean functions.
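For concreteness, here is a minimal sketch of the naive $n^k$-time exhaustive-search baseline that the abstract refers to (not the paper's improved algorithm): try every size-$k$ subset of the variables and accept one on which the observed examples are consistent, i.e., the restricted inputs determine the labels. All names below are illustrative, not from the paper.

```python
from itertools import combinations

def learn_junta_exhaustive(examples, n, k):
    """Naive junta learner: search all C(n, k) candidate relevant sets.

    examples: list of (x, y) pairs, x a tuple of n bits, y a bit.
    Returns (subset, truth_table) for the first consistent subset,
    or None if no size-k subset explains the examples.
    """
    for subset in combinations(range(n), k):
        table = {}          # partial truth table on the candidate subset
        consistent = True
        for x, y in examples:
            key = tuple(x[i] for i in subset)
            # setdefault records the first label seen for this restriction;
            # a later conflicting label rules this subset out.
            if table.setdefault(key, y) != y:
                consistent = False
                break
        if consistent:
            return subset, table
    return None

# Example: a 2-junta on n = 4 variables, f(x) = x[0] XOR x[2],
# learned from the full truth table as the example set.
xs = [((i >> 3) & 1, (i >> 2) & 1, (i >> 1) & 1, i & 1) for i in range(16)]
examples = [(x, x[0] ^ x[2]) for x in xs]
subset, table = learn_junta_exhaustive(examples, n=4, k=2)
# finds subset (0, 2)
```

The outer loop visits $\binom{n}{k} = O(n^k)$ subsets, which is exactly the exhaustive-search cost that the paper improves to roughly $(n^k)^{\omega/(\omega+1)}$.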

**Note (August '06):** Eric Bach and Lisa Hellerstein have informed us that Theorem 12 was proved earlier by T. Siegenthaler, *IEEE Transactions on Information Theory* 30(5), 1984. That result can also be used in place of Theorem 13.