

基于函數(shù)集信息量的模型選擇研究

盛守照 王道波 王志勝 黃向華

Citation: Sheng Shou-zhao, Wang Dao-bo, Wang Zhi-sheng, Huang Xiang-hua. Research on Model Selection Based on Function Set Information Quantity[J]. Journal of Electronics & Information Technology, 2005, 27(4): 552-555.

基于函數(shù)集信息量的模型選擇研究

Research on Model Selection Based on Function Set Information Quantity

  • Abstract: This paper introduces the concepts of subspace information quantity (SIQ) and function set information quantity (FSIQ), and discusses in detail model selection based on FSIQ. An approximate method for model selection from finite noisy samples is given, which effectively overcomes the under-fitting and over-fitting problems that pervade model selection and substantially improves the generalization performance of the resulting prediction model. On this basis, a feasible suboptimal model selection algorithm is proposed. Finally, concrete examples verify the feasibility and advantages of the method.
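The FSIQ criterion itself is not reproduced on this page, so as a generic illustration of the under-/over-fitting trade-off the abstract refers to, here is a minimal model-selection sketch using hold-out validation (plain NumPy; the polynomial model family, noise level, and split are arbitrary choices for illustration, not the authors' method):

```python
# Generic model selection sketch -- NOT the paper's FSIQ criterion.
# Complexity here is polynomial degree; too low under-fits the noisy
# samples, too high over-fits them, and a held-out set picks between.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 40)
y = np.sin(2 * np.pi * x) + rng.normal(0.0, 0.2, x.size)  # finite noisy samples

# Split the samples into training and validation halves.
idx = rng.permutation(x.size)
tr, va = idx[:20], idx[20:]

def val_error(degree):
    """Fit a degree-`degree` polynomial on the training half,
    return mean squared error on the validation half."""
    coeffs = np.polyfit(x[tr], y[tr], degree)
    resid = y[va] - np.polyval(coeffs, x[va])
    return float(np.mean(resid ** 2))

degrees = list(range(1, 10))
errors = [val_error(d) for d in degrees]
best = degrees[int(np.argmin(errors))]
print("selected degree:", best)
```

The selected degree minimizes validation error rather than training error, which is the basic mechanism any model-selection criterion (including information-quantity-based ones) formalizes.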
Metrics
  • Article views: 2346
  • Full-text HTML views: 88
  • PDF downloads: 665
  • Citations: 0
Publication history
  • Received: 2003-12-09
  • Revised: 2004-03-26
  • Published: 2005-04-19
