Local Probabilistic-Manifold learning enhanced framework for incomplete multiview clustering

Authors
  • Qi Li
  • Hongyan Wu
Publication date March 2026
Journal Knowledge-Based Systems
Article number 115444
Volume 337
Number of pages 13
Organisations
  • Faculty of Science (FNWI) - Informatics Institute (IVI)
Abstract
Multiview clustering (MVC) has garnered extensive attention across diverse domains due to its ability to integrate complementary information from multiple perspectives. A primary challenge in Incomplete Multiview Clustering (IMVC) is to maintain the consistent knowledge shared across all views while maximally extracting unique, complementary information from each view. Additionally, addressing potential data loss or inconsistencies within individual views is crucial. However, tackling these issues through point-by-point data alignment across views can be prohibitively expensive. This study proposes a local probabilistic-manifold learning-enhanced approach for IMVC, where we integrate manifold structures rather than aligning raw data across views. First, local manifold learning captures intrinsic data structures within each view, and a mutual information maximization (MIM) module extracts distinctive, low-noise information. The learned probabilistic manifolds enhance the consistent information shared across views while mitigating the impact of point loss and noise in any single view. Second, to integrate partially consistent information, we use view-wise attention and consistent learning modules to align the manifolds across all views. This alignment bypasses the need for explicit space mapping and point-by-point data alignment, harmonizing the data and effectively addressing inconsistencies within individual views. Finally, the model undergoes fine-tuning, supervised by an indicator and target clustering distribution, to optimize the learned representations specifically for clustering tasks. Compared with eight state-of-the-art baselines, the proposed method significantly outperforms the best baseline in terms of accuracy, normalized mutual information (NMI), adjusted Rand index (ARI), and F1-score by 3.23%, 3.65%, 4.88%, and 7.95%, respectively, on average.
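The abstract's final step, fine-tuning supervised by a target clustering distribution, is not spelled out here. A common construction for this kind of self-supervised clustering objective (popularized by Deep Embedded Clustering) computes Student-t soft assignments of embeddings to cluster centroids and sharpens them into a target distribution; the sketch below illustrates that construction under the assumption that the paper follows a similar scheme. The function names and the `alpha` parameter are illustrative, not taken from the paper.

```python
import numpy as np

def soft_assign(z, centroids, alpha=1.0):
    """Student-t kernel soft assignment q_ij of embeddings z to centroids.

    z: (n, d) array of learned representations (one row per sample).
    centroids: (k, d) array of cluster centers.
    Returns an (n, k) row-stochastic matrix of soft cluster memberships.
    """
    # Squared Euclidean distance from every sample to every centroid: (n, k)
    d2 = ((z[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=-1)
    q = (1.0 + d2 / alpha) ** (-(alpha + 1.0) / 2.0)
    return q / q.sum(axis=1, keepdims=True)

def target_distribution(q):
    """Sharpened target p_ij = (q_ij^2 / f_j) / sum_j'(q_ij'^2 / f_j'),
    where f_j = sum_i q_ij is the soft cluster frequency.

    Emphasizes high-confidence assignments, giving the self-supervised
    signal used to fine-tune the representations for clustering.
    """
    w = q ** 2 / q.sum(axis=0)
    return w / w.sum(axis=1, keepdims=True)
```

In this scheme the fine-tuning loss is typically the KL divergence KL(p || q), recomputing p periodically from the current q so the target stays ahead of, but anchored to, the model's own predictions.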
Document type Article
Language English
Published at https://doi.org/10.1016/j.knosys.2026.115444
Other links https://www.scopus.com/pages/publications/105029258440