Nikunj C. Oza's Publications


Boosting with Averaged Weight Vectors

Boosting with Averaged Weight Vectors. Nikunj C. Oza. In Fourth International Workshop on Multiple Classifier Systems, pp. 15–24, Springer-Verlag, Guildford, UK, June 2003.

Download

(unavailable)

Abstract

AdaBoost is a well-known ensemble learning algorithm that constructs its constituent or base models in sequence. A key step in AdaBoost is constructing a distribution over the training examples to create each base model. This distribution, represented as a vector, is constructed to be orthogonal to the vector of mistakes made by the previous base model in the sequence. The idea is to make the next base model's errors uncorrelated with those of the previous model. Some researchers have pointed out the intuition that it is probably better to construct a distribution orthogonal to the mistake vectors of all the previous base models, but that this is not always possible. We present an algorithm that attempts to come as close as possible to this goal in an efficient manner. We present experimental results demonstrating significant improvement over AdaBoost and the Totally Corrective boosting algorithm, which also attempts to satisfy this goal.
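
To make the mechanic concrete, here is a minimal Python sketch. The first function is the standard AdaBoost.M1 reweighting step, under which the previous base model's weighted error becomes exactly 1/2 (the "orthogonality to the mistake vector" the abstract refers to). The second function is a hypothetical reading of the averaging idea, keeping a running average of the weight vectors; it is an assumption for illustration, and the paper's exact update rule may differ.

import numpy as np

def adaboost_next_distribution(d, mistakes):
    # d: current distribution over training examples (nonnegative, sums to 1)
    # mistakes: boolean vector, True where the latest base model erred
    eps = float(np.dot(d, mistakes))          # weighted error of the base model
    beta = eps / (1.0 - eps)                  # reweighting factor (assumes eps < 1/2)
    d_next = np.where(mistakes, d, d * beta)  # down-weight correctly classified examples
    return d_next / d_next.sum()              # renormalize; old model's error is now 1/2

def averaged_next_distribution(d, mistakes, t):
    # Hypothetical averaging step (an assumption, not verbatim from the paper):
    # blend the AdaBoost update with the running distribution so the result
    # stays close to orthogonal to the mistake vectors of ALL earlier models,
    # not just the most recent one.
    c = adaboost_next_distribution(d, mistakes)
    return (t * d + c) / (t + 1)              # running average of weight vectors

# Example: four training examples, uniform initial distribution,
# first base model errs only on example 0.
d = np.full(4, 0.25)
m = np.array([True, False, False, False])
d2 = averaged_next_distribution(d, m, t=1)

Averaging keeps the cost per round the same as AdaBoost's, which is one way to read the abstract's claim of approximating the all-previous-models goal "in an efficient manner", in contrast to the Totally Corrective algorithm's iterative projections.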

BibTeX Entry

@inproceedings{oza03,
        author={Nikunj C. Oza},
        title={Boosting with Averaged Weight Vectors},
        booktitle={Fourth International Workshop on Multiple Classifier Systems},
        publisher={Springer-Verlag},
        address={Guildford, UK},
        editor={Fabio Roli and Josef Kittler and Terry Windeatt},
        pages={15--24},
        month={June},
        abstract={AdaBoost is a well-known ensemble learning algorithm that constructs its constituent or base models in sequence. A key step in AdaBoost is constructing a distribution over the training examples to create each base model. This distribution, represented as a vector, is constructed to be orthogonal to the vector of mistakes made by the previous base model in the sequence. The idea is to make the next base model's errors uncorrelated with those of the previous model. Some researchers have pointed out the intuition that it is probably better to construct a distribution orthogonal to the mistake vectors of all the previous base models, but that this is not always possible. We present an algorithm that attempts to come as close as possible to this goal in an efficient manner. We present experimental results demonstrating significant improvement over AdaBoost and the Totally Corrective boosting algorithm, which also attempts to satisfy this goal.},
        bib2html_pubtype={Refereed Conference},
        bib2html_rescat={Ensemble Learning},
        year={2003}
}

Generated by bib2html.pl (written by Patrick Riley) on Sun Jan 13, 2008 22:02:08