Improved Algorithm Based on Low Rank Representation for Subspace Clustering
Funds:
The National Natural Science Foundation of China (61473154)
Abstract: Existing subspace clustering algorithms based on low rank representation use the nuclear norm in place of the rank function, which cannot estimate the rank of a matrix effectively and is sensitive to Gaussian noise. This paper proposes an improved algorithm that raises clustering accuracy while preserving stability under Gaussian noise. In constructing the objective function, the nuclear norm and the Frobenius norm of the coefficient matrix are used as regularization terms, which imposes a strongly convex regularization on the singular values of the coefficient matrix; the problem is then solved with an inexact augmented Lagrange multiplier method. The resulting coefficient matrix is post-processed to obtain the affinity matrix, and classical spectral clustering is applied. Experimental comparisons with popular subspace clustering algorithms on synthetic data, the Extended Yale B database, and the PIE database demonstrate the effectiveness of the improved algorithm and its robustness to Gaussian noise.
關(guān)鍵詞:
- 子空間聚類 /
- 低秩表示 /
- 秩函數(shù) /
- Forbenius范數(shù) /
- 增廣拉格朗日乘子法
Abstract: Existing subspace clustering algorithms based on low rank representation replace the rank function with the nuclear norm, which cannot estimate the rank of the matrix effectively and is sensitive to Gaussian noise. In this paper, an improved algorithm is proposed to raise clustering accuracy while maintaining stability under Gaussian noise. When building the objective function, the nuclear norm and the Frobenius norm of the coefficient matrix are used as regularization terms; after this strongly convex regularization of the singular values of the coefficient matrix, an inexact augmented Lagrange multiplier method is used to solve the problem. Finally, the affinity matrix is obtained by post-processing the coefficient matrix, and the classical spectral clustering method is employed for clustering. Experimental comparisons with state-of-the-art algorithms on synthetic data and the Extended Yale B and PIE datasets demonstrate the effectiveness of the proposed method and its robustness to Gaussian noise.
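To make the description above concrete, the following is a minimal sketch, under assumed notation, of one plausible form of the model: X denotes the data matrix, Z the coefficient matrix, E a Gaussian-noise term, and λ, γ, μ hypothetical parameters (none of these symbols are taken from the paper itself). The point of the sketch is that adding the squared Frobenius norm makes the regularizer on the singular values strongly convex, so the corresponding proximal step reduces to singular value thresholding followed by a uniform shrinkage.

```latex
% Assumed elastic-net-style low-rank-representation model (notation is illustrative):
%   X : data matrix,  Z : coefficient matrix,  E : Gaussian-noise term,
%   \lambda, \gamma > 0 : hypothetical trade-off parameters.
\[
  \min_{Z,\,E}\ \|Z\|_{*} + \frac{\lambda}{2}\|Z\|_{F}^{2} + \gamma\|E\|_{F}^{2}
  \quad \text{s.t.} \quad X = XZ + E .
\]
% The Z-regularizer is strongly convex, so its proximal step has a closed form.
% With the SVD  W = U \Sigma V^{\top},  \Sigma = \mathrm{diag}(\sigma_i):
\[
  \operatorname*{arg\,min}_{Z}\ \|Z\|_{*} + \frac{\lambda}{2}\|Z\|_{F}^{2}
    + \frac{\mu}{2}\|Z - W\|_{F}^{2}
  \;=\; U\,\mathrm{diag}\!\left(\frac{\max(\mu\sigma_{i}-1,\,0)}{\lambda+\mu}\right) V^{\top},
\]
% i.e. ordinary singular value thresholding at level 1/\mu followed by the
% uniform shrinkage factor 1/(1+\lambda/\mu).
```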
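The final stage the abstract describes (post-process the coefficient matrix into an affinity matrix, then apply classical spectral clustering) can be sketched in a few lines of Python with NumPy and scikit-learn. The proximal step from the formula above is included as a helper; the symmetrization of |Z| used here is a common, illustrative post-processing choice rather than necessarily the paper's exact scheme, and a full inexact ALM solver would iterate the Z, E, and multiplier updates instead of the single stand-in step shown.

```python
import numpy as np
from sklearn.cluster import SpectralClustering

def prox_nuclear_frobenius(W, lam, mu):
    """Proximal step for ||Z||_* + (lam/2)||Z||_F^2 with penalty mu:
    singular value thresholding followed by a 1/(1 + lam/mu) shrinkage."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    s_new = np.maximum(mu * s - 1.0, 0.0) / (lam + mu)
    return (U * s_new) @ Vt

def cluster_from_coefficients(Z, n_clusters):
    """Build a symmetric, non-negative affinity matrix from the coefficient
    matrix and run classical spectral clustering (illustrative post-processing)."""
    A = 0.5 * (np.abs(Z) + np.abs(Z.T))
    sc = SpectralClustering(n_clusters=n_clusters, affinity='precomputed',
                            assign_labels='kmeans', random_state=0)
    return sc.fit_predict(A)

if __name__ == "__main__":
    # Toy example: samples drawn from two independent low-dimensional
    # subspaces, corrupted by a small amount of Gaussian noise.
    rng = np.random.default_rng(0)
    X1 = rng.standard_normal((30, 3)) @ rng.standard_normal((3, 50))
    X2 = rng.standard_normal((30, 3)) @ rng.standard_normal((3, 50))
    X = np.hstack([X1, X2]) + 0.01 * rng.standard_normal((30, 100))
    # Stand-in coefficient matrix: one proximal step applied to X^T X.
    # A complete inexact ALM solver would iterate updates of Z, E and the
    # Lagrange multiplier until the constraint X = XZ + E is satisfied.
    Z = prox_nuclear_frobenius(X.T @ X, lam=1.0, mu=1e-2)
    labels = cluster_from_coefficients(Z, n_clusters=2)
    print(labels)
```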
王衛(wèi)衛(wèi), 李小平, 馮象初, 等. 稀疏子空間聚類綜述[J]. 自動(dòng)化學(xué)報(bào), 2015, 41(8): 1373-1384. doi: 10.16383/j.aas.2015. c140891. WANG Weiwei, LI Xiaoping, FENG Xiangchu, et al. A survey on sparse subspace clustering[J]. Acta Automatica Sinica, 2015, 41(8): 1373-1384. doi: 10.16383/j.aas.2015.c140891. VIDAL R. Subspace clustering[J]. IEEE Signal Processing, 2011, 28(2): 52-68. doi: 10.1109/MSP.2010.939739. ELHAMIFAR E and VIDAL R. Sparse subspace clustering: Algorithm, theory, and applications[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2013, 34(11): 2765-2781. doi: 10.1109/TPAMI.2013.57. VIDAL R and FAVARO P. Low rank subspace clustering (LRSC)[J]. Pattern Recognition Letters, 2014, 43: 47-61. doi: 10.1016/j.patrec.2013.08.006. LIU G C, LIN Z C, and YU Y. Robust subspace segmentation by low-rank representation[C]. Proceedings of the International Conference on Machine Learning, Haifa, Israel, 2010: 663-670. LIU G C, LIN Z C, YAN S C, et al. Robust recovery of subspace structures by low-rank representation[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2013, 35(1): 171-184. doi: 10.1109/TPAMI.2012.88. FENG J S, LIN Z C, XU H, et al. Robust subspace segmentation with block-diagonal prior[C]. Proceedings of the IEEE International Conference on Computer Vision and Pattern Recognition, Columbus, USA, 2014: 3818-3825. doi: 10.1109/CVPR.2014.482. ZHANG X, SUN F, LIU G C, et al. Fast low-rank subspace segmentation[J]. IEEE Transactions on Knowledge and Data Engineering, 2014, 26(5): 1293-1297. doi: 10.1109/TKDE. 2013.114. PATEL V M, NGUYEN H V, and VIDAL R. Latent space sparse and low-rank subspace clustering[J]. IEEE Journal of Selected Topics in Signal Processing, 2015, 9(4): 691-701. doi: 10.1109/JSTSP.2015.2402643. LIU J M, CHEN Y J, ZHANG J S, et al. Enhancing low-rank subspace clustering by manifold regularization[J]. IEEE Transactions on Image Processing, 2014, 23(9): 4022-4030. doi: 10.1109/TIP.2014.2343458. YIN M, GAO J B, LIN Z C, et al. Dual graph regularized latent low-rank representation for subspace clustering[J]. IEEE Transactions on Image Processing, 2015, 24(12): 4918-4933. doi: 10.1109/TIP.2015.2472277. MOL C D, VITO E D, and ROSASCO L. Elastic-net regularization in learning theory[J]. Journal of Complexity, 2009, 25(2): 201-230. doi: 10.1016/j.jco.2009. 01.002. LIN Z C, CHEN M M, and MA Y. The augmented Lagrange multiplier method for exact recovery of corrupted low-rank matrices[R]. UIUC Technical Report UILU-ENG-09-2215, 2009. NG A Y, JORDAN M, and WEISS Y. On spectral clustering: Analysis and an algorithm[C]. Proceedings of the Advances in Neural Information Processing Systems, Vancouver, Canada, 2001: 849-856. LI H, CHEN N, and LI L Q. Error analysis for matrix elastic-net regularization algorithms[J]. IEEE Transactions on Neural Networks and Learning Systems, 2012, 23(5): 737-748. doi: 10.1109/TNNLS.2012.2188906. CAI J F, CANDES E J, and SHEN Z W. A singular value thresholding algorithm for matrix completion[J]. SIAM Journal on Control and Optimization, 2010, 20(4): 1956-1982. doi: 10.1137/080738970. SHI J B and MALIK J. Normalized cuts and image segmentation[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2000, 22(8): 888-905. doi: 10.1109/ 34.868688. BERTSEKAS D. Constrained Optimization and Lagrange Multiplier Methods[M]. Belmont, MA, USA: Athena Scientific, 1996: 326-340. CANDES E J, LI X D, MA Y, et al. Robust principal component analysis[J]. Journal of the ACM, 2010, 58(1): 1-37. doi: 10.1145/1970392.1970395. 
YAN J Y and POLLEFEYS M. A general framework for motion segmentation: Independent, articulated, rigid, non-rigid, degenerate and non-degenerate[C]. Proceedings of the European Conference on Computer Vision, Graz, Austria, 2006: 94-106. doi: 10.1007/11744085_8. CHEN J L and LERMAN G. Spectral curvature clustering (SCC)[J]. International Journal of Computer Vision, 2009, 81(3): 317-330. doi: 10.1007/s11263-008-0178-9. LU C Y, MIN H, ZHAO Z Q, et al. Robust and efficient subspace segmentation via least squares regression[C]. Proceedings of the European Conference on Computer Vision, Florence, Italy, 2012: 347-360. doi: 10.1007/978-3-642- 33786-4_26. -
計(jì)量
- 文章訪問數(shù): 1476
- HTML全文瀏覽量: 126
- PDF下載量: 689
- 被引次數(shù): 0