Learning Non-Metric Visual Similarity for Image Retrieval

Noa Garcia, George Vogiatzis

Research output: Contribution to journal › Article › peer-review


Measuring visual similarity between two or more instances within a data distribution is a fundamental task in image retrieval. Theoretically, non-metric distances are able to generate a more complex and accurate similarity model than metric distances, provided that the non-linear data distribution is precisely captured by the system. In this work, we explore neural network models for learning a non-metric similarity function for instance search. We argue that non-metric similarity functions based on neural networks can build a better model of human visual perception than standard metric distances. As our proposed similarity function is differentiable, we explore a truly end-to-end trainable approach for image retrieval, i.e. we learn the weights from the input image pixels to the final similarity score. Experimental evaluation shows that non-metric similarity networks are able to learn visual similarities between images and improve performance on top of state-of-the-art image representations, boosting results on standard image retrieval datasets with respect to standard metric distances.
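The core idea of the abstract, replacing a fixed metric (e.g. Euclidean distance) with a learned, differentiable scoring network over pairs of image descriptors, can be sketched as follows. This is an illustrative reconstruction, not the paper's actual architecture: the layer sizes, initialization, and the two-layer MLP design are assumptions, and real descriptors would come from a CNN rather than random vectors.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_similarity_net(feat_dim, hidden=32):
    # Hypothetical two-layer MLP mapping a concatenated pair of image
    # descriptors to a scalar similarity score. Sizes are illustrative.
    return {
        "W1": rng.standard_normal((2 * feat_dim, hidden)) * 0.1,
        "b1": np.zeros(hidden),
        "W2": rng.standard_normal((hidden, 1)) * 0.1,
        "b2": np.zeros(1),
    }

def similarity(params, x, y):
    # Learned score s(x, y); unlike a metric distance, it is not
    # constrained to satisfy symmetry or the triangle inequality.
    h = np.maximum(np.concatenate([x, y]) @ params["W1"] + params["b1"], 0.0)
    return float(h @ params["W2"] + params["b2"])

params = init_similarity_net(4)
x = rng.standard_normal(4)
y = rng.standard_normal(4)
# In general s(x, y) need not equal s(y, x): the function is non-metric.
print(similarity(params, x, y), similarity(params, y, x))
```

Because every operation above is differentiable, the score can be trained jointly with the feature extractor, which is what makes the pixels-to-score pipeline end-to-end trainable.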
Original language: English
Pages (from-to): 18-25
Number of pages: 8
Journal: Image and Vision Computing
Publication status: Published - 10 Feb 2019

Bibliographical note

© 2019, Elsevier. Licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International http://creativecommons.org/licenses/by-nc-nd/4.0/


Keywords

  • Image retrieval
  • Non-metric learning
  • Visual similarity


