A Kullback-Leibler Divergence-Based Locally Linear Embedding Method: A Novel Parametric Approach for Cluster Analysis

Abstract


Numerous problems in machine learning require some form of dimensionality reduction. Unsupervised metric learning is concerned with defining intrinsic, adaptive distance functions for a dataset. Locally linear embedding (LLE) is a widely used manifold learning algorithm that performs dimensionality reduction by capturing the local geometry of neighborhood patches, yielding a more compact and meaningful representation of the observed data. To overcome relevant limitations of the LLE approach, we introduce the LLE Kullback-Leibler (LLE-KL) method, whose aim is to increase the robustness of LLE to noise and outliers in the data. The proposed method employs the KL divergence between patches of the KNN graph instead of the pointwise Euclidean metric. Our empirical results on several real-world datasets indicate that the proposed method delivers superior cluster allocations compared with state-of-the-art dimensionality reduction-based metric learning methods.
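
The abstract does not give implementation details, but the core idea admits a compact sketch. The following minimal Python illustration is ours, not the authors' reference implementation; the names patch_gaussians, sym_kl, and lle_kl are hypothetical. It assumes each patch (a point plus its k nearest neighbors) is modeled as a multivariate Gaussian via its sample mean and covariance, a symmetrized KL divergence between patch Gaussians replaces the pointwise Euclidean distance when selecting neighbors, and the weight-fitting and eigen-embedding steps follow standard LLE.

# A minimal sketch of the idea described above, under the stated assumptions;
# not the authors' method as published.
import numpy as np
from scipy.spatial.distance import cdist
from scipy.linalg import eigh

def patch_gaussians(X, k):
    """Fit a Gaussian (mean, covariance) to each point's Euclidean k-NN patch."""
    n, d = X.shape
    idx = np.argsort(cdist(X, X), axis=1)[:, :k + 1]  # each point plus its k neighbors
    mus = np.array([X[i].mean(axis=0) for i in idx])
    # A small ridge keeps the patch covariances invertible.
    covs = np.array([np.cov(X[i], rowvar=False) + 1e-3 * np.eye(d) for i in idx])
    return mus, covs

def sym_kl(mu1, S1, mu2, S2):
    """Symmetrized KL divergence between two multivariate Gaussians."""
    d = mu1.shape[0]
    S1i, S2i = np.linalg.inv(S1), np.linalg.inv(S2)
    dm = mu1 - mu2
    kl12 = 0.5 * (np.trace(S2i @ S1) + dm @ S2i @ dm - d
                  + np.log(np.linalg.det(S2) / np.linalg.det(S1)))
    kl21 = 0.5 * (np.trace(S1i @ S2) + dm @ S1i @ dm - d
                  + np.log(np.linalg.det(S1) / np.linalg.det(S2)))
    return 0.5 * (kl12 + kl21)

def lle_kl(X, k=10, n_components=2):
    """LLE with patch-to-patch KL divergences in place of Euclidean distances."""
    n, _ = X.shape
    mus, covs = patch_gaussians(X, k)
    D = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            D[i, j] = D[j, i] = sym_kl(mus[i], covs[i], mus[j], covs[j])
    neighbors = np.argsort(D, axis=1)[:, 1:k + 1]  # skip self at column 0
    # Standard LLE reconstruction weights on each divergence-based neighborhood.
    W = np.zeros((n, n))
    for i in range(n):
        Z = X[neighbors[i]] - X[i]                  # centered neighbors
        C = Z @ Z.T
        G = C + 1e-3 * np.trace(C) * np.eye(k)      # regularized Gram matrix
        w = np.linalg.solve(G, np.ones(k))
        W[i, neighbors[i]] = w / w.sum()
    # Embedding: bottom eigenvectors of (I - W)^T (I - W), dropping the trivial one.
    M = (np.eye(n) - W).T @ (np.eye(n) - W)
    _, vecs = eigh(M)
    return vecs[:, 1:n_components + 1]

# Example usage: Y = lle_kl(X, k=10, n_components=2) for an (n, d) data matrix X.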
Keywords: Locally linear embedding, Metric learning, KL divergence
Published
29/11/2021
How to Cite

LEVADA, Alexandre L. M.; HADDAD, Michel F. C. A Kullback-Leibler Divergence-Based Locally Linear Embedding Method: A Novel Parametric Approach for Cluster Analysis. In: BRAZILIAN CONFERENCE ON INTELLIGENT SYSTEMS (BRACIS), 10., 2021, Online. Proceedings [...]. Porto Alegre: Sociedade Brasileira de Computação, 2021. ISSN 2643-6264.