Stochastic Systems Group

SSG Seminar Abstract


On the Consistency of Boosting Algorithms


Dr. Shie Mannor
LIDS, MIT


Boosting is a general approach for constructing a complex classifier through an incremental procedure based on a sequence of so-called weak learners. While each weak learner is only able to do marginally (but consistently) better than random guessing, the composite classifier it builds often performs very well. In this talk we will present results on the conditions needed to guarantee that the composite classifier is consistent, that is, that it ultimately attains the minimal (Bayes) error. While it is known that boosting may not be consistent in general, we will concentrate on linear weak learners and provide geometric conditions under which the boosting classification algorithm is consistent.
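For readers unfamiliar with the boosting scheme the abstract refers to, the sketch below shows one standard instance, AdaBoost with axis-aligned decision stumps as the (linear) weak learners. It is only an illustration of the generic incremental procedure, not the specific algorithm or the consistency conditions analyzed in the talk; the function and parameter names are ours.

    # Minimal AdaBoost sketch: weak learners (decision stumps) are fit on
    # reweighted data and combined into a weighted-vote composite classifier.
    import numpy as np

    def fit_stump(X, y, w):
        """Pick the (feature, threshold, sign) stump with smallest weighted error."""
        best = (0, 0.0, 1, np.inf)
        for j in range(X.shape[1]):
            for thr in np.unique(X[:, j]):
                for sign in (1, -1):
                    pred = sign * np.where(X[:, j] <= thr, 1, -1)
                    err = np.sum(w[pred != y])
                    if err < best[3]:
                        best = (j, thr, sign, err)
        return best

    def adaboost(X, y, rounds=50):
        """Return a list of ((feature, threshold, sign), alpha) pairs; y in {-1, +1}."""
        n = len(y)
        w = np.full(n, 1.0 / n)                    # example weights
        ensemble = []
        for _ in range(rounds):
            j, thr, sign, err = fit_stump(X, y, w)
            err = max(err, 1e-12)                  # guard against a perfect stump
            alpha = 0.5 * np.log((1 - err) / err)  # weak learner's vote weight
            pred = sign * np.where(X[:, j] <= thr, 1, -1)
            w *= np.exp(-alpha * y * pred)         # upweight misclassified points
            w /= w.sum()
            ensemble.append(((j, thr, sign), alpha))
        return ensemble

    def predict(ensemble, X):
        """Sign of the weighted vote of all weak learners."""
        score = np.zeros(len(X))
        for (j, thr, sign), alpha in ensemble:
            score += alpha * sign * np.where(X[:, j] <= thr, 1, -1)
        return np.sign(score)

The question the talk addresses is whether, as the number of rounds and the sample size grow, the error of such a composite classifier approaches the Bayes error.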


