Design of an Exponential-like Kernel Function Based on Multi-scale Resampling
doi: 10.11999/JEIT151101
① Fluid Dynamics Institute, School of Aerospace, Tsinghua University, Beijing 100084, China
② China Aerodynamics Research and Development Center, Mianyang 621000, China
③ Daqing Flood Control Engineering Management Office, Daqing 163311, China
Funds:
The National Natural Science Foundation of China (11472158)
Abstract: Based on the idea of multi-scale resampling, an Exponential-Like Kernel (ELK) function is constructed and applied to kernel regression and to Support Vector Machine (SVM) classification, where it shows an advantage in capturing local features. The ELK distribution is determined solely by the scale of analysis, so it is a one-parameter kernel. Nadaraya-Watson regression of noisy block (step) and Doppler signals shows that ELK outperforms the conventional Gaussian kernel in both noise reduction and step capturing, and that its overall accuracy is close to or better than that of LOcally WEighted Scatterplot Smoothing (LOWESS). SVM tests on several data sets from the UCI Machine Learning Repository show that ELK achieves classification accuracy comparable to the Radial Basis Function (RBF) kernel; its stronger locality yields more detailed separating hyperplanes, although it may produce more support vectors when classification accuracy is low. Moreover, ELK is less sensitive to the tuning parameter of kernel methods, which helps reduce the computational cost of parameter optimization. As a single-parameter kernel with good local-feature capture, ELK is promising for broad use in related kernel methods.
Keywords:
- Multi-scale resampling /
- Nadaraya-Watson regression /
- Support Vector Machine (SVM) /
- Exponential-like kernel function
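As an illustration of the Nadaraya-Watson comparison described in the abstract, the sketch below smooths a noisy step ("block") signal with a kernel regression estimator. The paper's ELK is constructed by multi-scale resampling and its explicit form is not given in this summary, so a plain exponential (Laplacian-type) kernel is used here only as a stand-in; the test signal, noise level, and bandwidth are likewise illustrative assumptions.

```python
import numpy as np

def exp_like_kernel(u, h):
    # Stand-in exponential-type kernel; NOT the paper's multi-scale ELK.
    return np.exp(-np.abs(u) / h)

def gaussian_kernel(u, h):
    return np.exp(-0.5 * (u / h) ** 2)

def nadaraya_watson(x_train, y_train, x_query, kernel, h):
    # y_hat(x) = sum_i K(x - x_i; h) * y_i / sum_i K(x - x_i; h)
    u = x_query[:, None] - x_train[None, :]   # (n_query, n_train) differences
    w = kernel(u, h)
    return (w * y_train[None, :]).sum(axis=1) / w.sum(axis=1)

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 400)
step = np.where(x < 0.5, 0.0, 1.0)                # noise-free block (step) signal
y = step + 0.1 * rng.standard_normal(x.size)      # noisy observations

y_elk = nadaraya_watson(x, y, x, exp_like_kernel, h=0.02)
y_gauss = nadaraya_watson(x, y, x, gaussian_kernel, h=0.02)
print("exp-like kernel MSE:", np.mean((y_elk - step) ** 2))
print("Gaussian kernel MSE:", np.mean((y_gauss - step) ** 2))
```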
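The SVM comparison can be sketched in the same spirit. scikit-learn's SVC accepts a callable kernel that returns a Gram matrix, which is one way to plug a custom kernel in next to the built-in RBF; the exponential-type Gram function below again stands in for the paper's ELK, and the Iris data set (originally from the UCI Machine Learning Repository) and the scale value are illustrative choices, not those used in the paper.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

def exp_like_gram(X, Y, scale=1.0):
    # Gram matrix K[i, j] = exp(-||x_i - y_j|| / scale); a stand-in for ELK.
    d = np.linalg.norm(X[:, None, :] - Y[None, :, :], axis=-1)
    return np.exp(-d / scale)

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

elk_svm = SVC(kernel=lambda A, B: exp_like_gram(A, B, scale=1.0)).fit(X_tr, y_tr)
rbf_svm = SVC(kernel="rbf", gamma="scale").fit(X_tr, y_tr)

print("exp-like kernel accuracy:", elk_svm.score(X_te, y_te))
print("RBF kernel accuracy     :", rbf_svm.score(X_te, y_te))
print("exp-like support vectors:", int(elk_svm.n_support_.sum()))
print("RBF support vectors     :", int(rbf_svm.n_support_.sum()))
```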