Enhanced Similarity Matrix Learning for Multi-View Clustering
Abstract
1. Introduction
- We propose a novel multi-view clustering method, named Enhanced Similarity Matrix Learning for Multi-View Clustering (ES-MVC), which exploits the complementary information contained in all views of the data.
- Our method leverages both local and global structures across multiple views to construct a consistent graph for enhanced clustering performance.
- Furthermore, we apply a rank constraint to the similarity matrix in order to dynamically optimize neighbor assignment for improved clustering results.
- Additionally, we develop a robust optimization algorithm to efficiently solve our objective function.
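The rank constraint mentioned above has a concrete spectral interpretation: a graph whose Laplacian has exactly c zero eigenvalues splits into exactly c connected components, so the clusters can be read off the learned graph directly. A minimal sketch of that check (variable names and the tolerance are illustrative, not from the paper):

```python
import numpy as np

def num_components(S, tol=1e-8):
    """Count connected components of a graph with affinity matrix S by
    counting (near-)zero eigenvalues of its unnormalized Laplacian.
    When the count equals the class number c, the rank constraint
    rank(L) = n - c is satisfied."""
    S = (S + S.T) / 2.0                     # symmetrize the affinities
    L = np.diag(S.sum(axis=1)) - S          # unnormalized Laplacian
    eigvals = np.linalg.eigvalsh(L)         # real, ascending order
    return int(np.sum(eigvals < tol))
```

For example, a block-diagonal affinity matrix with two blocks yields two near-zero eigenvalues, i.e. two components.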
2. Related Works
2.1. Similarity Matrix Learning for Multi-View Clustering
2.2. Robust Multi-View Clustering
3. Our Method
3.1. Motivation
3.2. ES-MVC Method
3.2.1. Local Structure in a Single View
3.2.2. Global Structure Across Different Views
3.2.3. ES-MVC Objective Function
3.2.4. Robust Initial Graph Construction
3.3. Optimization
- Update the cluster indicator matrix and the Laplacian while fixing the similarity matrix. To be specific, the similarity matrix is known in the current iteration, so its Laplacian can easily be calculated by Equation (6), and the indicator matrix is updated by minimizing the objective function in Equation (10). In this case, the second and third terms of Equation (10) are constant, so Equation (10) reduces to the subproblem in Equation (11). According to matrix theory (Ky Fan's theorem [40]), the columns of the optimal indicator matrix for Equation (11) are the c eigenvectors of the Laplacian corresponding to its c smallest eigenvalues.
- Update the similarity matrix while fixing the indicator matrix and the view weights. After introducing suitable auxiliary variables, simple algebra shows that, for each i, the row-wise subproblem admits the closed-form solution given in Equation (13).
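The spectral sub-step above (solving Equation (11)) amounts to one small eigendecomposition. A minimal sketch, with illustrative names (`S` for the current similarity matrix, `c` for the class number):

```python
import numpy as np

def update_indicator(S, c):
    """Minimize tr(F^T L F) over F with orthonormal columns: by Fan's
    theorem the optimum stacks the c eigenvectors of the Laplacian L
    belonging to its c smallest eigenvalues."""
    S = (S + S.T) / 2.0                     # symmetrize
    L = np.diag(S.sum(axis=1)) - S          # unnormalized Laplacian
    _, eigvecs = np.linalg.eigh(L)          # eigenvalues ascend
    return eigvecs[:, :c]
```

The returned columns are orthonormal by construction, which is exactly the constraint on the indicator matrix in Equation (11).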
Algorithm 1 ES-MVC
Input: the per-view graphs and the class number c.
Initialize: the initial graph.
while not converged do
1. Calculate the required matrices for each view.
2. Calculate the fused Laplacian.
3. Update the cluster indicator matrix.
4. For each i, update the i-th row of the similarity matrix.
5. Update the iteration counter: t ← t + 1.
end while
Compute the clustering labels by running k-means on the indicator matrix or spectral clustering on the learned similarity matrix.
Output: the final clustering result.
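The alternating loop of Algorithm 1 can be sketched end-to-end as follows. The initialization, the inverse-distance weighting rule, and all variable names here are assumptions for illustration; the paper's exact update formulas (Equations (6)–(13)) are not reproduced:

```python
import numpy as np

def _project_simplex(v):
    # Euclidean projection of v onto {x : x >= 0, sum(x) = 1}
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u + (1.0 - css) / np.arange(1, v.size + 1) > 0)[0][-1]
    return np.maximum(v + (1.0 - css[rho]) / (rho + 1), 0.0)

def es_mvc_sketch(A_views, c, lam=1.0, n_iter=10):
    """Illustrative alternating loop in the spirit of Algorithm 1:
    fuse the per-view graphs, take a spectral step, then re-fit each
    row of the similarity matrix on the probability simplex."""
    n = A_views[0].shape[0]
    S = sum(A_views) / len(A_views)              # fused graph init
    for _ in range(n_iter):
        # per-view weights: views closer to the fused graph weigh more
        w = np.array([1.0 / (2.0 * np.linalg.norm(S - A) + 1e-12)
                      for A in A_views])
        w /= w.sum()
        # spectral step: c smallest eigenvectors of the fused Laplacian
        Ssym = (S + S.T) / 2.0
        L = np.diag(Ssym.sum(axis=1)) - Ssym
        F = np.linalg.eigh(L)[1][:, :c]
        # squared spectral distances penalize cross-cluster affinities
        dist = ((F[:, None, :] - F[None, :, :]) ** 2).sum(axis=-1)
        target = sum(wv * A for wv, A in zip(w, A_views))
        S = np.vstack([_project_simplex(target[i] - lam * dist[i] / 2.0)
                       for i in range(n)])
    return S, F
```

Each row of the returned similarity matrix is nonnegative and sums to one, matching the usual probability-simplex constraint on learned graphs.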
3.4. Convergence Analysis
4. Experiments
4.1. Experiments on the MSRC-v1 Dataset
4.2. Experiments on the Caltech101 Dataset
4.3. Experiments on the HW Dataset
4.4. Experiments on a Visual and Text Dataset
4.5. Experiments on Two Text Datasets
4.6. Experimental Analysis
- From Table 2, Table 3, Table 4, Table 5, Table 6 and Table 7, it is evident that nearly all multi-view clustering methods outperform the best single-view method based on SC [3]. This demonstrates the superior effectiveness of multi-view methods in these scenarios. The results further suggest that multi-view data offers richer information, and the different views complement each other.
- The results in Table 5 also show that our method can be applied to multi-modal datasets and achieves promising performance.
- The results in Table 6 and Table 7 also show that our method can be applied to text datasets, where it outperforms the other state-of-the-art methods.
- Figure 6 visualizes the similarity matrices for the MSRC-v1 dataset. The first three matrices are the initial local similarity matrices for views 1, 2, and 3; the last three are the matrices learned by our method using the global structure only, the local structure only, and both. Each matrix has seven diagonal blocks representing the seven classes, with values indicating within-cluster affinities. A cleaner diagonal block with more nonzero entries signifies better clustering performance. Our method yields a clear block-diagonal structure with more affinity points on the diagonal, resulting in superior clustering. The sixth block, highlighted in red, demonstrates the effectiveness of our method compared with using only the global or only the local structure.
- Figure 7 shows ACC on the MSRC-v1 dataset as the trade-off parameter varies. From Figure 7, we observe that the clustering performance is not best when the parameter is zero or infinite; Figure 6 corroborates this. We also notice from Figure 6 that the diagonal blocks of the similarity matrix learned by our method are more complete than the others. This illustrates that using the global structure or the local structure alone cannot recover the desired similarity structure; in particular, the local manifold structure within each view is not enough for clustering by itself.
- Figure 8 shows the convergence curve of our proposed method versus the number of iterations on four datasets. It demonstrates that our optimization algorithm is efficient: in particular, the method converges in fewer than five iterations.
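The tables in this section report ACC, NMI, and Purity. NMI is available as `sklearn.metrics.normalized_mutual_info_score`; the other two can be computed as below. This is a sketch assuming integer labels starting at 0 (function and variable names are ours, not the paper's):

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def clustering_acc(y_true, y_pred):
    """Best-match accuracy: map predicted cluster ids to true labels
    with the Hungarian algorithm, then score as plain accuracy."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    k = max(y_true.max(), y_pred.max()) + 1
    cost = np.zeros((k, k), dtype=int)
    for t, p in zip(y_true, y_pred):
        cost[p, t] += 1                      # co-occurrence counts
    row, col = linear_sum_assignment(-cost)  # maximize matched counts
    return cost[row, col].sum() / len(y_true)

def purity(y_true, y_pred):
    """Fraction of samples belonging to their cluster's majority class."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    total = 0
    for cl in np.unique(y_pred):
        labels = y_true[y_pred == cl]
        total += np.bincount(labels).max()   # majority class size
    return total / len(y_true)
```

Note that ACC is invariant to a permutation of cluster ids, which is why the Hungarian matching step is required before scoring.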
4.7. Advantages and Limitations
5. Conclusions and Future Works
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Zhao, J.; Xie, X.; Xu, X.; Sun, S. Multi-view learning overview: Recent progress and new challenges. Inf. Fusion 2017, 38, 43–54.
- Sun, S. A survey of multi-view machine learning. Neural Comput. Appl. 2013, 23, 2031–2038.
- Ng, A.Y.; Jordan, M.I.; Weiss, Y. On spectral clustering: Analysis and an algorithm. In Proceedings of the NIPS, Vancouver, BC, Canada, 9–14 December 2002; pp. 849–856.
- Duan, Y.; Nie, F.; Wang, R.; Li, X. Harmonic cut: An efficient and directly solved balanced graph clustering. Neurocomputing 2024, 578, 127381.
- Sun, J.; Bi, J.; Kranzler, H.R. Multi-view biclustering for genotype-phenotype association studies of complex diseases. In Proceedings of the IEEE BIBM, Shanghai, China, 18–21 December 2013; pp. 316–321.
- Jing, X.Y.; Wu, F.; Dong, X.; Shan, S.; Chen, S. Semi-Supervised Multi-View Correlation Feature Learning with Application to Webpage Classification. In Proceedings of the AAAI, San Francisco, CA, USA, 4–9 February 2017; pp. 1374–1381.
- Tsivtsivadze, E.; Borgdorff, H.; van de Wijgert, J.; Schuren, F.; Verhelst, R.; Heskes, T. Neighborhood co-regularized multi-view spectral clustering of microbiome data. In Proceedings of the IAPR, Kyoto, Japan, 20–23 May 2013; pp. 80–90.
- Chao, G.; Sun, S.; Bi, J. A survey on multi-view clustering. arXiv 2017, arXiv:1712.06246.
- Xu, Y.M.; Wang, C.D.; Lai, J.H. Weighted multi-view clustering with feature selection. Pattern Recognit. 2016, 53, 25–35.
- Lu, C.; Yan, S.; Lin, Z. Convex sparse spectral clustering: Single-view to multi-view. IEEE Trans. Image Process. 2016, 25, 2833–2843.
- Zhao, X.; Evans, N.; Dugelay, J.L. A subspace co-training framework for multi-view clustering. Pattern Recognit. Lett. 2014, 41, 73–82.
- Ye, Y.; Liu, X.; Yin, J.; Zhu, E. Co-regularized kernel k-means for multi-view clustering. In Proceedings of the IEEE ICPR, Cancun, Mexico, 4–8 December 2016; pp. 1583–1588.
- Tan, J.; Yang, Z.; Cheng, Y.; Ye, J.; Dai, Q. SRAGL-AWCL: A Two-step Multi-view Clustering via Sparse Representation and Adaptive Weighted Cooperative Learning. Pattern Recognit. 2021, 117, 107987.
- Li, J.; Zhao, H.; Tao, Z.; Fu, Y. Large-scale Subspace Clustering by Fast Regression Coding. In Proceedings of the IJCAI, Melbourne, Australia, 19–25 August 2017; pp. 2138–2144.
- Cao, X.; Zhang, C.; Fu, H.; Liu, S.; Zhang, H. Diversity-induced multi-view subspace clustering. In Proceedings of the IEEE CVPR, Boston, MA, USA, 7–12 June 2015; pp. 586–594.
- Zhao, J.; Lu, G. Clean affinity matrix learning with rank equality constraint for multi-view subspace clustering. Pattern Recognit. 2023, 134, 109118.
- Yin, Q.; Wu, S.; He, R.; Wang, L. Multi-view clustering via pairwise sparse subspace representation. Neurocomputing 2015, 156, 12–21.
- Liu, J.; Wang, C.; Gao, J.; Han, J. Multi-view clustering via joint nonnegative matrix factorization. In Proceedings of the IEEE ICDM, Dallas, TX, USA, 7–10 December 2013; pp. 252–260.
- Zong, L.; Zhang, X.; Zhao, L.; Yu, H.; Zhao, Q. Multi-view clustering via multi-manifold regularized non-negative matrix factorization. Neural Netw. 2017, 88, 74–89.
- Yang, W.; Wang, Y.; Tang, C.; Tong, H.; Wei, A.; Wu, X. One step multi-view spectral clustering via joint adaptive graph learning and matrix factorization. Neurocomputing 2023, 524, 95–105.
- Akata, Z.; Thurau, C.; Bauckhage, C. Non-negative matrix factorization in multimodality data for segmentation and label prediction. In Proceedings of the Computer Vision Winter Workshop, Colorado Springs, CO, USA, 20–25 June 2011.
- Yu, S.; Falck, T.; Daemen, A.; Tranchevent, L.C.; Suykens, J.A.; De Moor, B.; Moreau, Y. L2-norm multiple kernel learning and its application to biomedical data fusion. BMC Bioinform. 2010, 11, 309.
- Kumar, A.; Rai, P.; Daume, H. Co-regularized multi-view spectral clustering. In Proceedings of the NIPS, Granada, Spain, 12–14 December 2011; pp. 1413–1421.
- Li, X.; Ren, Z.; Sun, Q.; Xu, Z. Auto-weighted Tensor Schatten p-Norm for Robust Multi-view Graph Clustering. Pattern Recognit. 2023, 134, 109083.
- Duan, Y.; Wu, D.; Wang, R.; Li, X.; Nie, F. Scalable and parameter-free fusion graph learning for multi-view clustering. Neurocomputing 2024, 597, 128037.
- Cai, D.; He, X.; Han, J. Document clustering using locality preserving indexing. IEEE Trans. Knowl. Data Eng. 2005, 17, 1624–1637.
- Tang, W.; Lu, Z.; Dhillon, I.S. Clustering with multiple graphs. In Proceedings of the IEEE ICDM, Miami, FL, USA, 6–9 December 2009; pp. 1016–1021.
- Nie, F.; Li, J.; Li, X. Parameter-Free Auto-Weighted Multiple Graph Learning: A Framework for Multiview Clustering and Semi-Supervised Classification. In Proceedings of the IJCAI, New York, NY, USA, 9–15 July 2016; pp. 1881–1887.
- Nie, F.; Cai, G.; Li, X. Multi-View Clustering and Semi-Supervised Classification with Adaptive Neighbours. In Proceedings of the AAAI, San Francisco, CA, USA, 4–9 February 2017; pp. 2408–2414.
- Yang, M.S.; Hussain, I. Unsupervised multi-view K-means clustering algorithm. IEEE Access 2023, 11, 13574–13593.
- Hussain, I.; Sinaga, K.P.; Yang, M.S. Unsupervised multiview fuzzy c-means clustering algorithm. Electronics 2023, 12, 4467.
- Hussain, I.; Nataliani, Y.; Ali, M.; Hussain, A.; Mujlid, H.M.; Almaliki, F.A.; Rahimi, N.M. Weighted Multiview K-Means Clustering with L2 Regularization. Symmetry 2024, 16, 1646.
- Xia, R.; Pan, Y.; Du, L.; Yin, J. Robust Multi-View Spectral Clustering via Low-Rank and Sparse Decomposition. In Proceedings of the AAAI, Québec City, QC, Canada, 27–31 July 2014; pp. 2149–2155.
- Zhang, C.; Chen, L.; Shi, Z.; Ding, W. Latent information-guided one-step multi-view fuzzy clustering based on cross-view anchor graph. Inf. Fusion 2024, 102, 102025.
- Wang, X.; Qian, B.; Ye, J.; Davidson, I. Multi-objective multi-view spectral clustering via pareto optimization. In Proceedings of the IEEE ICDM, Dallas, TX, USA, 7–10 December 2013; pp. 234–242.
- Li, X.; Chen, M.; Nie, F.; Wang, Q. A Multiview-Based Parameter Free Framework for Group Detection. In Proceedings of the AAAI, San Francisco, CA, USA, 4–9 February 2017; pp. 4147–4153.
- Nie, F.; Li, J.; Li, X. Self-weighted Multiview Clustering with Multiple Graphs. In Proceedings of the IJCAI, Melbourne, Australia, 19–25 August 2017; pp. 2564–2570.
- Nie, F.; Wang, X.; Jordan, M.I.; Huang, H. The Constrained Laplacian Rank Algorithm for Graph-Based Clustering. In Proceedings of the AAAI, Phoenix, AZ, USA, 12–17 February 2016; pp. 1969–1976.
- Nie, F.; Wang, X.; Huang, H. Clustering and projected clustering with adaptive neighbors. In Proceedings of the 20th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, New York, NY, USA, 24–27 August 2014; pp. 977–986.
- Fan, K. On a theorem of Weyl concerning eigenvalues of linear transformations II. Proc. Natl. Acad. Sci. USA 1950, 36, 31–35.
- Wang, X.; Guo, X.; Lei, Z.; Zhang, C.; Li, S.Z. Exclusivity-consistency regularized multi-view subspace clustering. In Proceedings of the IEEE CVPR, Honolulu, HI, USA, 21–26 July 2017; pp. 923–931.
- Estévez, P.A.; Tesmer, M.; Perez, C.A.; Zurada, J.M. Normalized mutual information feature selection. IEEE Trans. Neural Netw. 2009, 20, 189–201.
- Varshavsky, R.; Linial, M.; Horn, D. Compact: A comparative package for clustering assessment. In Proceedings of the ISPA, Nanjing, China, 2–5 November 2005; pp. 159–167.
- Wu, J.; Rehg, J.M. Where am I: Place instance and category recognition using spatial PACT. In Proceedings of the IEEE CVPR, Anchorage, AK, USA, 23–28 June 2008; pp. 1–8.
- Dalal, N.; Triggs, B. Histograms of oriented gradients for human detection. In Proceedings of the IEEE CVPR, San Diego, CA, USA, 20–26 June 2005; Volume 1, pp. 886–893.
- Oliva, A.; Torralba, A. Modeling the shape of the scene: A holistic representation of the spatial envelope. Int. J. Comput. Vis. 2001, 42, 145–175.
- Ojala, T.; Pietikainen, M.; Maenpaa, T. Multiresolution gray-scale and rotation invariant texture classification with local binary patterns. IEEE Trans. Pattern Anal. Mach. Intell. 2002, 24, 971–987.
- Lowe, D.G. Distinctive image features from scale-invariant keypoints. Int. J. Comput. Vis. 2004, 60, 91–110.
- Le, Q.; Mikolov, T. Distributed Representations of Sentences and Documents. arXiv 2014, arXiv:1405.4053v2.
- Jones, K.S. A statistical interpretation of term specificity and its application in retrieval. J. Doc. 2004, 60, 493–502.
- Deerwester, S.; Dumais, S.T.; Furnas, G.W.; Landauer, T.K. Indexing by Latent Semantic Analysis. J. Am. Soc. Inf. Sci. 1990, 41, 391–407.
- Winn, J.; Jojic, N. Locus: Learning object classes with unsupervised segmentation. In Proceedings of the IEEE ICCV, Beijing, China, 17–20 October 2005; Volume 1, pp. 756–763.
- Fei-Fei, L.; Fergus, R.; Perona, P. Learning generative visual models from few training examples: An incremental bayesian approach tested on 101 object categories. Comput. Vis. Image Underst. 2007, 106, 59–70.
- van Breukelen, M.; Duin, R.P.; Tax, D.M.; Den Hartog, J. Handwritten digit recognition by combined classifiers. Kybernetika 1998, 34, 381–386.
- Cai, X.; Wang, H.; Huang, H.; Ding, C. Joint stage recognition and anatomical annotation of drosophila gene expression patterns. Bioinformatics 2012, 28, i16–i24.
- Greene, D.; Cunningham, P. Practical Solutions to the Problem of Diagonal Dominance in Kernel Document Clustering. In Proceedings of the 23rd International Conference on Machine Learning, Pittsburgh, PA, USA, 25–29 June 2006.
Methods | Robust Similarity Matrix | Global Structure | Local Structure | Weighted Fusion |
---|---|---|---|---|
Co-training [22] | No | No | Yes | No |
Co-regularized [26] | No | No | Yes | No |
LMF [27] | No | Yes | No | No |
RMSC [33] | Yes | Yes | No | No |
AMGL [28] | No | Yes | No | Yes |
MLAN [29] | No | Yes | No | Yes |
MPF [36] | No | Yes | Yes | Yes |
PwMC [37] | Yes | Yes | No | No |
ES-MVC | Yes | Yes | Yes | Yes |
Methods | ACC | NMI | Purity |
---|---|---|---|
SC1 | 0.3244 ± 0.04 | 0.2740 ± 0.03 | 0.3878 ± 0.05 |
SC2 | 0.5608 ± 0.05 | 0.5077 ± 0.04 | 0.5995 ± 0.06 |
SC3 | 0.6311 ± 0.08 | 0.5954 ± 0.07 | 0.6949 ± 0.05 |
SC4 | 0.4699 ± 0.07 | 0.4225 ± 0.04 | 0.5196 ± 0.03 |
SC5 | 0.5611 ± 0.03 | 0.4747 ± 0.02 | 0.5914 ± 0.05 |
ConcatSC | 0.6769 ± 0.05 | 0.6539 ± 0.05 | 0.7219 ± 0.08 |
ESMSC | 0.6953 ± 0.02 | 0.6867 ± 0.01 | 0.7575 ± 0.03 |
RMSC | 0.6751 ± 0.06 | 0.6187 ± 0.05 | 0.7147 ± 0.07 |
AMGL | 0.6875 ± 0.06 | 0.6587 ± 0.08 | 0.7212 ± 0.05 |
MLAN | 0.6811 ± 0.02 | 0.6297 ± 0.02 | 0.7331 ± 0.02 |
OMVFC-LICAG | 0.7284 ± 0.03 | 0.5907 ± 0.02 | 0.7384 ± 0.03 |
Ours | 0.7643 ± 0.08 | 0.7275 ± 0.06 | 0.8125 ± 0.04 |
Methods | ACC | NMI | Purity |
---|---|---|---|
SC1 | 0.4821 ± 0.04 | 0.3660 ± 0.04 | 0.5577 ± 0.04 |
SC2 | 0.5601 ± 0.04 | 0.4378 ± 0.05 | 0.6347 ± 0.04 |
SC3 | 0.4611 ± 0.03 | 0.2755 ± 0.03 | 0.4915 ± 0.03 |
ConcatSC | 0.5357 ± 0.03 | 0.4525 ± 0.04 | 0.6178 ± 0.05 |
ESMSC | 0.5551 ± 0.03 | 0.4687 ± 0.03 | 0.6377 ± 0.05 |
RMSC | 0.5005 ± 0.04 | 0.3416 ± 0.02 | 0.5544 ± 0.05 |
AMGL | 0.4615 ± 0.05 | 0.3274 ± 0.06 | 0.5255 ± 0.02 |
MLAN | 0.5531 ± 0.02 | 0.4787 ± 0.02 | 0.6487 ± 0.01 |
OMVFC-LICAG | 0.4512 ± 0.02 | 0.3612 ± 0.04 | 0.5283 ± 0.06 |
Ours | 0.6097 ± 0.04 | 0.5132 ± 0.03 | 0.6902 ± 0.03 |
Methods | ACC | NMI | Purity |
---|---|---|---|
SC1 | 0.6315 ± 0.07 | 0.6626 ± 0.04 | 0.6815 ± 0.06 |
SC2 | 0.6152 ± 0.10 | 0.6847 ± 0.05 | 0.6824 ± 0.08 |
SC3 | 0.7505 ± 0.08 | 0.8180 ± 0.06 | 0.8131 ± 0.09 |
SC4 | 0.8495 ± 0.05 | 0.8567 ± 0.08 | 0.8725 ± 0.08 |
SC5 | 0.6109 ± 0.07 | 0.6068 ± 0.05 | 0.6601 ± 0.05 |
SC6 | 0.4049 ± 0.04 | 0.4771 ± 0.05 | 0.4541 ± 0.08 |
ConcatSC | 0.8270 ± 0.15 | 0.8690 ± 0.07 | 0.8649 ± 0.09 |
ESMSC | 0.7399 ± 0.01 | 0.7968 ± 0.02 | 0.7769 ± 0.02 |
RMSC | 0.8625 ± 0.02 | 0.8105 ± 0.04 | 0.8715 ± 0.04 |
AMGL | 0.8336 ± 0.03 | 0.8568 ± 0.03 | 0.8405 ± 0.05 |
MLAN | 0.9731 ± 0.01 | 0.9387 ± 0.01 | 0.9731 ± 0.02 |
OMVFC-LICAG | 0.8372 ± 0.03 | 0.7981 ± 0.02 | 0.8381 ± 0.04 |
Ours | 0.9745 ± 0.04 | 0.9407 ± 0.04 | 0.9739 ± 0.02 |
Methods | ACC | NMI | Purity |
---|---|---|---|
SC1 | 0.4694 ± 0.02 | 0.2757 ± 0.02 | 0.4740 ± 0.02 |
SC2 | 0.3893 ± 0.02 | 0.2057 ± 0.09 | 0.3907 ± 0.02 |
ConcatSC | 0.4747 ± 0.08 | 0.2962 ± 0.03 | 0.4871 ± 0.02 |
ESMSC | 0.9191 ± 0.02 | 0.8535 ± 0.05 | 0.9192 ± 0.02 |
RMSC | 0.4585 ± 0.01 | 0.2672 ± 0.03 | 0.4584 ± 0.04 |
AMGL | 0.6032 ± 0.02 | 0.6102 ± 0.05 | 0.6855 ± 0.06 |
MLAN | 0.9501 ± 0.02 | 0.8874 ± 0.01 | 0.9505 ± 0.02 |
OMVFC-LICAG | 0.7754 ± 0.05 | 0.6941 ± 0.04 | 0.7752 ± 0.03 |
Ours | 0.9688 ± 0.02 | 0.9132 ± 0.03 | 0.9687 ± 0.03 |
Methods | ACC | NMI | Purity |
---|---|---|---|
SC1 | 0.4325 ± 0.02 | 0.1485 ± 0.04 | 0.4642 ± 0.03 |
SC2 | 0.5573 ± 0.01 | 0.3444 ± 0.05 | 0.6230 ± 0.09 |
SC3 | 0.7094 ± 0.03 | 0.6204 ± 0.08 | 0.7299 ± 0.05 |
ConcatSC | 0.7132 ± 0.07 | 0.6215 ± 0.03 | 0.7228 ± 0.05 |
ESMSC | 0.7375 ± 0.05 | 0.6420 ± 0.08 | 0.7387 ± 0.08 |
RMSC | 0.6788 ± 0.09 | 0.5672 ± 0.05 | 0.7275 ± 0.05 |
AMGL | 0.7165 ± 0.07 | 0.5825 ± 0.06 | 0.7460 ± 0.03 |
MLAN | 0.7365 ± 0.04 | 0.6277 ± 0.02 | 0.7555 ± 0.05 |
OMVFC-LICAG | 0.5172 ± 0.07 | 0.5590 ± 0.02 | 0.3101 ± 0.06 |
Ours | 0.7855 ± 0.09 | 0.6686 ± 0.07 | 0.7997 ± 0.08 |
Methods | ACC | NMI | Purity |
---|---|---|---|
SC1 | 0.3695 ± 0.02 | 0.0123 ± 0.02 | 0.3864 ± 0.05 |
SC2 | 0.3652 ± 0.02 | 0.0080 ± 0.03 | 0.3768 ± 0.04 |
SC3 | 0.3745 ± 0.05 | 0.0167 ± 0.06 | 0.3909 ± 0.06 |
SC4 | 0.3751 ± 0.06 | 0.0208 ± 0.03 | 0.3785 ± 0.08 |
SC5 | 0.3731 ± 0.02 | 0.0254 ± 0.02 | 0.3925 ± 0.05 |
SC6 | 0.3774 ± 0.03 | 0.0121 ± 0.03 | 0.3774 ± 0.05 |
ConcatSC | 0.3757 ± 0.06 | 0.0382 ± 0.04 | 0.3908 ± 0.07 |
ESMSC | 0.3973 ± 0.02 | 0.0381 ± 0.02 | 0.4013 ± 0.05 |
RMSC | 0.3755 ± 0.02 | 0.0373 ± 0.02 | 0.3961 ± 0.08 |
AMGL | 0.3843 ± 0.08 | 0.0199 ± 0.04 | 0.3852 ± 0.04 |
MLAN | 0.3852 ± 0.02 | 0.0211 ± 0.01 | 0.3877 ± 0.02 |
OMVFC-LICAG | 0.3011 ± 0.03 | 0.2552 ± 0.02 | 0.2984 ± 0.03 |
Ours | 0.4533 ± 0.02 | 0.0456 ± 0.05 | 0.4767 ± 0.04 |
Zhang, D.; Wang, P.; Li, Q. Enhanced Similarity Matrix Learning for Multi-View Clustering. Electronics 2025, 14, 2845. https://doi.org/10.3390/electronics14142845