P. Long and R. Servedio.
22nd Annual Conference on Neural Information Processing Systems (NIPS), 2008, pp. 977-984.

Abstract:

In recent work Long and Servedio \cite{LS05short} presented a ``martingale boosting'' algorithm that works by constructing a branching program over weak classifiers and has a simple analysis based on elementary properties of random walks. \cite{LS05short} showed that this martingale booster can tolerate random classification noise when it is run with a noise-tolerant weak learner; however, a drawback of the algorithm is that it is not \emph{adaptive}, i.e., it cannot effectively take advantage of variation in the quality of the weak classifiers it receives.

In this paper we present a variant of the original martingale boosting algorithm and prove that it is adaptive. This adaptiveness is achieved by modifying the original algorithm so that the random walks that arise in its analysis have different step sizes depending on the quality of the weak learner at each stage. The new algorithm inherits the desirable properties of the original \cite{LS05short} algorithm, such as tolerance of random classification noise, and has several other advantages besides adaptiveness: it requires polynomially fewer calls to the weak learner than the original algorithm, and it can be used with confidence-rated weak hypotheses that output real values rather than Boolean predictions.
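The random-walk intuition behind this adaptiveness can be illustrated with a small simulation (not the paper's algorithm, just a toy model with assumed parameters): each boosting stage $t$ moves an example along a walk, and a weak hypothesis with advantage $\gamma_t$ over random guessing contributes a step of size proportional to $\gamma_t$, taken in the correct direction with probability $1/2 + \gamma_t$. Stages with better weak hypotheses thus push the walk toward a correct final classification faster.

```python
import random

def adaptive_walk(advantages, trials=10000, seed=0):
    """Toy simulation of the variable-step random-walk idea.

    `advantages` lists a hypothetical advantage gamma_t for each stage's
    weak hypothesis. At stage t the walk moves +gamma_t with probability
    1/2 + gamma_t (the weak hypothesis is correct) and -gamma_t otherwise.
    Returns the fraction of simulated walks that end at a positive
    position, i.e. would be classified correctly.
    """
    rng = random.Random(seed)
    correct = 0
    for _ in range(trials):
        pos = 0.0
        for g in advantages:
            pos += g if rng.random() < 0.5 + g else -g
        if pos > 0:
            correct += 1
    return correct / trials
```

For example, twenty stages with advantage 0.4 drive almost every walk positive, while twenty stages with advantage 0.05 leave a substantial fraction of walks negative; letting the step size track the per-stage advantage is what lets the booster exploit unusually good weak hypotheses.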