
Multi-class Adaboost Algorithm Based on the Adjusted Weak Classifier

YANG Xinwu, MA Zhuang, YUAN Shun

Citation: YANG Xinwu, MA Zhuang, YUAN Shun. Multi-class Adaboost Algorithm Based on the Adjusted Weak Classifier[J]. Journal of Electronics & Information Technology, 2016, 38(2): 373-380. doi: 10.11999/JEIT150544


doi: 10.11999/JEIT150544


  • Abstract: The Adaboost.M1 algorithm requires each weak classifier to achieve an accuracy above 1/2, but such weak classifiers are hard to find in multi-class problems. The Stagewise Additive Modeling using a Multi-class Exponential loss function (SAMME) algorithm lowers this requirement to an accuracy above 1/k (where k is the number of classes), making weak classifiers easier to obtain. However, because SAMME cannot guarantee the effectiveness of its weak classifiers, it cannot guarantee that the accuracy of the final strong classifier improves. This paper therefore analyzes the principle of multi-class Adaboost both graphically and mathematically, and proposes a new multi-class method that both relaxes the requirement on weak classifiers and ensures their effectiveness. Comparative experiments on UCI datasets show that the proposed algorithm outperforms SAMME and performs no worse than Adaboost.M1.
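The 1/2 and 1/k accuracy thresholds contrasted in the abstract follow directly from the classifier-weight formulas of Adaboost.M1 and SAMME. A minimal sketch (function names are ours, not from the paper):

```python
import math

def samme_alpha(err, k):
    """SAMME weight for a weak classifier with error rate err on k classes:
    alpha = log((1 - err) / err) + log(k - 1).
    alpha is positive exactly when accuracy (1 - err) exceeds 1/k."""
    return math.log((1.0 - err) / err) + math.log(k - 1)

def m1_alpha(err):
    """Adaboost.M1 weight: log((1 - err) / err),
    positive only when accuracy exceeds 1/2."""
    return math.log((1.0 - err) / err)

# With k = 4 classes, a classifier that is only 30% accurate (err = 0.70)
# still clears SAMME's 1/k threshold but fails Adaboost.M1's 1/2 threshold:
assert samme_alpha(0.70, 4) > 0
assert m1_alpha(0.70) < 0
# Accuracy 20% is below 1/4, so even SAMME rejects it:
assert samme_alpha(0.80, 4) < 0
```

The extra log(k - 1) term is what lowers the bar from 1/2 to 1/k; the paper's concern is that a classifier clearing only this weaker bar may still be ineffective.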
Publication history
  • Received: 2015-05-11
  • Revised: 2015-10-08
  • Published: 2016-02-19
