Ensemble methods, which construct many classifiers and combine them to make a final decision, have enjoyed great success in statistics and machine learning for their significant improvements in classification accuracy. Bagging (Breiman, 1996) and boosting (Freund and Schapire, 1997) are two of the most popular ensemble methods. Many comparative studies of bagging and boosting indicate that, although boosting is more accurate than bagging in most cases, boosting may overfit highly noisy data sets, degrading its performance.
In this talk, I introduce a new ensemble algorithm called the ``Convex Hull Ensemble Machine (CHEM).'' CHEM is first presented in a general Hilbert-space setting and then adapted to regression and classification problems. Empirical studies reveal that in classification problems CHEM achieves prediction accuracy similar to that of boosting, but CHEM is much more robust to output noise and does not overfit data sets even when boosting does. In regression problems, CHEM performs competitively with other ensemble methods such as gradient boosting and bagging.
Meet the speaker in Room 212, Cockins Hall, at 4:30 p.m. Refreshments will be served.