GPR1200: A Benchmark for General-Purpose Content-Based Image Retrieval

28th International Conference on Multimedia Modeling (MMM 2022)

TL;DR

We introduce GPR1200, a new benchmark dataset designed to evaluate the generalization quality of image retrieval models across diverse categories. Our experiments show that large-scale pretraining significantly enhances retrieval performance.

Abstract

Even though it has been shown extensively that retrieval-specific training of deep neural networks is beneficial for nearest-neighbor image search quality, most of these models are trained and tested in the domain of landmark images. However, some applications use images from various other domains and therefore need a network with good generalization properties - a general-purpose CBIR model. To the best of our knowledge, no testing protocol has so far been introduced to benchmark models with respect to general image retrieval quality. After analyzing popular image retrieval test sets, we decided to manually curate GPR1200, an easy-to-use and accessible, yet challenging benchmark dataset with a broad range of image categories. This benchmark is subsequently used to evaluate pretrained models of various architectures with respect to their generalization qualities. We show that large-scale pretraining significantly improves retrieval performance and present experiments on how to further increase these properties through appropriate fine-tuning. With these promising results, we hope to increase interest in the research topic of general-purpose CBIR.
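
Evaluation sketch

For readers who want to reproduce the kind of measurement described above, the following is a minimal sketch of a nearest-neighbor retrieval evaluation, assuming the common protocol in which every image queries all remaining images by cosine similarity of its descriptor and the scores are aggregated as mean average precision (mAP). The NumPy code, function names, and random toy features are illustrative assumptions, not the official GPR1200 evaluation script.

import numpy as np

# NOTE: illustrative sketch only -- not the official GPR1200 evaluation code.

def average_precision(ranked_labels, query_label):
    """Average precision of one ranked result list against the query's category label."""
    relevant = (ranked_labels == query_label).astype(np.float32)
    if relevant.sum() == 0:
        return 0.0
    hits = np.cumsum(relevant)
    precision_at_k = hits / (np.arange(len(relevant)) + 1)
    return float((precision_at_k * relevant).sum() / relevant.sum())

def mean_average_precision(embeddings, labels):
    """Every image queries all remaining images, ranked by cosine similarity."""
    emb = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sims = emb @ emb.T
    np.fill_diagonal(sims, -np.inf)              # the query itself is not a valid result
    ranking = np.argsort(-sims, axis=1)[:, :-1]  # drop the query (now ranked last)
    aps = [average_precision(labels[ranking[i]], labels[i]) for i in range(len(labels))]
    return float(np.mean(aps))

# Toy example with random descriptors; in practice `embeddings` would hold a
# model's descriptors for the GPR1200 evaluation images (assumed protocol).
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(100, 128)).astype(np.float32)
labels = rng.integers(0, 10, size=100)
print(f"mAP with random features: {mean_average_precision(embeddings, labels):.4f}")

With real model descriptors in place of the random features, this kind of script yields a single mAP number per model, which is how the generalization comparison across architectures and pretraining regimes can be summarized.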

BibTeX

If you use our work in your research, please cite our publication:

@InProceedings{10.1007/978-3-030-98358-1_17,
author="Schall, Konstantin
and Barthel, Kai Uwe
and Hezel, Nico
and Jung, Klaus",
editor="{\TH}{\'o}r J{\'o}nsson, Bj{\"o}rn
and Gurrin, Cathal
and Tran, Minh-Triet
and Dang-Nguyen, Duc-Tien
and Hu, Anita Min-Chun
and Huynh Thi Thanh, Binh
and Huet, Benoit",
title="GPR1200: A Benchmark for General-Purpose Content-Based Image Retrieval",
booktitle="MultiMedia Modeling",
year="2022",
publisher="Springer International Publishing",
address="Cham",
pages="205--216",
isbn="978-3-030-98358-1"
}