Entropic Laplacian eigenmaps for unsupervised metric learning

  • Alexandre Luís Magalhães Levada (UFSCar)
  • Michel Ferreira Cardia Haddad (University of Cambridge / Queen Mary University of London)

Abstract

Unsupervised metric learning is concerned with building adaptive distance functions prior to pattern classification. Laplacian eigenmaps is a manifold learning algorithm that uses dimensionality reduction to find more compact and meaningful representations of datasets through the Laplacian matrix of graphs. In the present paper, we propose the entropic Laplacian eigenmaps (ELAP) algorithm, a parametric approach that employs the Kullback–Leibler (KL) divergence between patches of the KNN graph, instead of the pointwise Euclidean metric, as the cost function for the graph weights. The objective of this modification is to increase the robustness of Laplacian eigenmaps against noise and outliers. Our results on various real-world datasets indicate that the proposed method generates more reasonable clusters and achieves higher classification accuracies than widely adopted methods for dimensionality reduction-based metric learning.
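To make the idea concrete, the sketch below shows one way such an entropic weighting could be wired into a standard Laplacian eigenmaps pipeline. It is a minimal illustration, not the authors' implementation: it assumes each KNN patch is summarized by a diagonal Gaussian, uses a symmetrised KL divergence between those Gaussians, and passes it through a heat kernel to obtain edge weights; the function name `elap_embedding` and all parameter choices are hypothetical.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors
from scipy.linalg import eigh


def elap_embedding(X, n_neighbors=10, n_components=2):
    """Hypothetical sketch of a KL-divergence-weighted Laplacian eigenmaps embedding."""
    n = X.shape[0]
    nbrs = NearestNeighbors(n_neighbors=n_neighbors + 1).fit(X)
    _, idx = nbrs.kneighbors(X)  # idx[i, 0] is the point itself

    # Assumption: model each KNN patch as a diagonal Gaussian (mean, variance).
    mus = np.array([X[idx[i]].mean(axis=0) for i in range(n)])
    sigmas = np.array([X[idx[i]].var(axis=0) + 1e-6 for i in range(n)])

    def sym_kl(i, j):
        # Symmetrised KL divergence between two diagonal Gaussians.
        d_ij = 0.5 * np.sum(np.log(sigmas[j] / sigmas[i])
                            + (sigmas[i] + (mus[i] - mus[j]) ** 2) / sigmas[j] - 1.0)
        d_ji = 0.5 * np.sum(np.log(sigmas[i] / sigmas[j])
                            + (sigmas[j] + (mus[j] - mus[i]) ** 2) / sigmas[i] - 1.0)
        return 0.5 * (d_ij + d_ji)

    # Affinities on the KNN graph: heat kernel of the KL distance instead of
    # the pointwise Euclidean distance used in classical Laplacian eigenmaps.
    W = np.zeros((n, n))
    for i in range(n):
        for j in idx[i, 1:]:
            w = np.exp(-sym_kl(i, j))
            W[i, j] = max(W[i, j], w)
            W[j, i] = W[i, j]  # keep the graph symmetric

    # Standard Laplacian eigenmaps step: solve L y = lambda D y and keep the
    # smallest nontrivial eigenvectors.
    D = np.diag(W.sum(axis=1))
    L = D - W
    _, vecs = eigh(L, D)
    return vecs[:, 1:n_components + 1]  # skip the constant eigenvector


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))
    Y = elap_embedding(X, n_neighbors=10, n_components=2)
    print(Y.shape)  # (200, 2)
```

In this toy version, only the edge-weight computation differs from classical Laplacian eigenmaps; the graph construction and the generalized eigenproblem are unchanged, which is what allows a patch-based divergence to be dropped in as the cost function for the weights.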
Keywords: Measurement, Graphics, Laplace equations, Buildings, Pattern classification, Euclidean distance, Cost function, Unsupervised metric learning, dimensionality reduction, Laplacian Eigenmaps, KL divergence, manifold learning
Published
18/10/2021
How to Cite

LEVADA, Alexandre Luís Magalhães; HADDAD, Michel Ferreira Cardia. Entropic Laplacian eigenmaps for unsupervised metric learning. In: CONFERENCE ON GRAPHICS, PATTERNS AND IMAGES (SIBGRAPI), 34., 2021, Online. Proceedings [...]. Porto Alegre: Sociedade Brasileira de Computação, 2021.