Abstract
We propose a new, robust boosting method that uses a sigmoidal function as the loss function. In deriving the method, the stagewise additive modelling methodology is blended with gradient descent algorithms. Based on intensive numerical experiments, we show that the proposed method achieves lower test error rates than AdaBoost and other regularized methods in noisy, mislabeled situations.
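The idea described in the abstract can be illustrated with a minimal sketch: a bounded sigmoidal loss on the margin, `L(m) = 1/(1 + exp(m))` with `m = y·F(x)`, saturates for badly misclassified (possibly mislabeled) points, so outliers stop dominating the fit; each boosting round then fits a weak learner to the negative gradient of this loss, in the stagewise additive / gradient-descent style. All names and details below (the stump learner, the learning rate, the exact loss form) are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def sigmoid_loss_grad(y, F):
    # dL/dF for the bounded sigmoidal loss L = 1/(1 + exp(y*F)).
    # Because L saturates as y*F -> -inf, mislabeled points get a
    # vanishing gradient instead of an exploding one (unlike AdaBoost's
    # exponential loss) -- this is the source of the robustness.
    s = 1.0 / (1.0 + np.exp(y * F))
    return -y * s * (1.0 - s)

def fit_stump(X, r):
    # Least-squares decision stump fitted to the pseudo-residuals r.
    best = None
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            left = X[:, j] <= t
            cl = r[left].mean() if left.any() else 0.0
            cr = r[~left].mean() if (~left).any() else 0.0
            pred = np.where(left, cl, cr)
            err = np.sum((r - pred) ** 2)
            if best is None or err < best[0]:
                best = (err, j, t, cl, cr)
    return best[1:]

def predict_stump(stump, X):
    j, t, cl, cr = stump
    return np.where(X[:, j] <= t, cl, cr)

def boost(X, y, n_rounds=30, lr=0.5):
    # Stagewise additive model F(x) = sum_k lr * h_k(x), where each h_k
    # is fitted to the negative gradient of the sigmoidal loss.
    F = np.zeros(len(y))
    model = []
    for _ in range(n_rounds):
        r = -sigmoid_loss_grad(y, F)       # pseudo-residuals
        stump = fit_stump(X, r)
        F += lr * predict_stump(stump, X)
        model.append(stump)
    return model, F
```

A fixed shrinkage `lr` stands in for the per-round step-size (line-search) computation that a full implementation would perform; the robustness argument only needs the bounded loss and its gradient.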
Original language | English |
---|---|
Pages (from-to) | 182-196 |
Number of pages | 15 |
Journal | Journal of the Operations Research Society of Japan |
Volume | 47 |
Issue number | 3 |
DOIs | |
Publication status | Published - 2004 Sept |
Externally published | Yes |
Keywords
- AdaBoost
- Boosting
- Data analysis
- Data mining
- Machine learning
- Sigmoidal function
ASJC Scopus subject areas
- Decision Sciences (all)
- Management Science and Operations Research