PD-Loss: Proxy-Decidability for Efficient Metric Learning
Abstract
Deep Metric Learning (DML) aims to learn embedding functions that map semantically similar inputs to proximate points in a metric space while separating dissimilar ones. Existing methods, such as pairwise losses, are hindered by complex sampling requirements and slow convergence. In contrast, proxy-based losses, despite their improved scalability, often fail to optimize global distribution properties. The Decidability-based Loss (D-Loss) addresses this by targeting the decidability index (d′) to enhance distribution separability, but its reliance on large mini-batches imposes significant computational constraints. We introduce Proxy-Decidability Loss (PD-Loss), a novel objective that integrates learnable proxies with the statistical framework of d′ to optimize embedding spaces efficiently. By estimating genuine and impostor distributions through proxies, PD-Loss combines the computational efficiency of proxy-based methods with the principled separability of D-Loss, offering a scalable approach to distribution-aware DML. Experiments across various tasks, including fine-grained classification and face verification, demonstrate that PD-Loss achieves performance comparable to that of state-of-the-art methods while introducing a new perspective on embedding optimization, with potential for broader applications.
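To make the core idea concrete, the following is a minimal sketch of how a proxy-based decidability objective could be computed. It is not the authors' implementation: the helper names (`decidability_index`, `pd_loss_sketch`) and the use of cosine similarity between samples and class proxies are assumptions for illustration. The decidability index here is the standard d′ statistic, d′ = |μ_G − μ_I| / sqrt((σ_G² + σ_I²)/2), computed over genuine (sample-to-own-proxy) and impostor (sample-to-other-proxy) similarity scores; maximizing d′ is expressed as minimizing its negative.

```python
import numpy as np

def decidability_index(genuine, impostor):
    """Decidability d' between genuine and impostor score distributions:
    d' = |mu_G - mu_I| / sqrt((var_G + var_I) / 2)."""
    mu_g, mu_i = genuine.mean(), impostor.mean()
    var_g, var_i = genuine.var(), impostor.var()
    return abs(mu_g - mu_i) / np.sqrt(0.5 * (var_g + var_i))

def pd_loss_sketch(embeddings, labels, proxies):
    """Hypothetical proxy-decidability loss (illustrative, not the paper's code).
    Cosine similarities between samples and learnable class proxies stand in
    for the genuine (same class) and impostor (different class) distributions;
    minimizing -d' pushes the two distributions apart."""
    # L2-normalize so dot products are cosine similarities
    e = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    p = proxies / np.linalg.norm(proxies, axis=1, keepdims=True)
    sims = e @ p.T  # shape: (batch_size, num_classes)
    # Boolean mask selecting each sample's own-class proxy
    mask = labels[:, None] == np.arange(p.shape[0])[None, :]
    genuine = sims[mask]    # sample-to-own-proxy similarities
    impostor = sims[~mask]  # sample-to-other-proxy similarities
    return -decidability_index(genuine, impostor)
```

Because the distributions are estimated against a fixed set of proxies rather than over all sample pairs, the statistic is available from any mini-batch, which is the efficiency argument the abstract makes relative to the large-batch requirement of D-Loss.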
Keywords:
Graphics, Scalability, Extraterrestrial measurements, Computational efficiency, Indexes, Optimization, Faces, Convergence
Published
30/09/2025
How to Cite
SILVA, Pedro H. L.; SILVA, Guilherme A. L.; COELHO, Pablo; FREITAS, Vander; MOREIRA, Gladston; MENOTTI, David; LUZ, Eduardo. PD-Loss: Proxy-Decidability for Efficient Metric Learning. In: CONFERENCE ON GRAPHICS, PATTERNS AND IMAGES (SIBGRAPI), 38., 2025, Salvador/BA. Anais [...]. Porto Alegre: Sociedade Brasileira de Computação, 2025. p. 146-151.
