We give two results on PAC learning DNF formulas using membership queries in the challenging ``distribution-free'' learning framework, where learning algorithms must succeed for an arbitrary and unknown distribution over $\{0,1\}^n$.
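Throughout, ``learning'' is in the standard PAC sense: given accuracy and confidence parameters $\varepsilon, \delta > 0$, the learner must, with probability at least $1 - \delta$, output a hypothesis $h$ satisfying $\Pr_{x \sim {\cal D}}[h(x) \neq f(x)] \leq \varepsilon$, where $f$ is the target function.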
(1) We first give a quasi-polynomial time ``list-decoding'' algorithm for learning a \emph{single term} of an unknown DNF formula. More precisely, for any target $s$-term DNF formula $f = T_1 \vee \cdots \vee T_s$ over $\zo^n$ and any unknown distribution ${\cal D}$ over $\zo^n$, our algorithm, which uses membership queries and random examples from ${\cal D}$, runs in $\quasipoly(n,s)$ time and outputs a list $\calL$ of candidate terms such that with high probability some term $T_i$ of $f$ belongs to $\calL$.
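In other words, writing $\delta$ for the failure probability, the output list satisfies $\Pr[\,T_i \in \calL \text{ for some } i \in [s]\,] \geq 1 - \delta$, where the probability is taken over the random examples drawn from ${\cal D}$ and the internal randomness of the algorithm; in particular, since the algorithm runs in $\quasipoly(n,s)$ time, the list it outputs has size at most $\quasipoly(n,s)$.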
(2) We then use result (1) to give a $\quasipoly(n,s)$-time algorithm, in the distribution-free PAC learning model with membership queries, for learning the class of $s$-term DNF formulas in which all terms have the same size. Our algorithm outputs a DNF hypothesis.
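For concreteness, one hypothetical member of this class, with $s = 2$ terms each of size exactly three, is $f(x) = (x_1 \wedge \overline{x}_3 \wedge x_7) \vee (\overline{x}_2 \wedge x_4 \wedge x_5)$.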
The key tool used to establish result (1) is a new result on ``locally mixing random walks,'' which, roughly speaking, shows that a random walk on a graph covered by a small number of expanders has a non-negligible probability of mixing quickly within a subset of these expanders.
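Schematically, and with all quantitative parameters suppressed, this statement has the following shape: if a graph $G$ is covered by expanders $H_1, \ldots, H_k$ (in the sense made precise in the paper) with $k$ small, then a random walk on $G$ started from a suitable vertex has a non-negligible probability of mixing rapidly within $\bigcup_{j \in S} H_j$ for some nonempty subset $S \subseteq [k]$.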