Probabilistic neural networks supporting multi-class relevance feedback in region-based image retrieval

Byoung Chul Ko, Hyeran Byun

Research output: Contribution to journal › Article

8 Citations (Scopus)

Abstract

Relevance feedback is one of the most widely used image retrieval techniques. Its main role is to bridge the gap between a human's high-level concepts and a computer's low-level features. Although several relevance feedback algorithms exist, some rely on ad hoc heuristics or assume that feature vectors are independent regardless of their correlation. In this paper, we propose a new relevance feedback algorithm using Probabilistic Neural Networks (PNN) that supports multi-class learning. Our approach does not require feature vectors to be independent, and it permits the system to insert additional classes for finer-grained classification. In addition, training requires little computation time, because the network has only four layers. During the PNN classification process, we keep the user's entire past feedback actions as a history in order to improve performance in future iterations. Using this history, our approach can capture the user's subjective intention more precisely and prevent retrieval performance from fluctuating or degrading in the next iteration. To validate the effectiveness of our feedback approach, we incorporate the proposed algorithm into our region-based image retrieval tool, FRIP (Finding Region In the Pictures). The efficacy of our method is validated on a set of 3,000 images from the Corel photo CD.
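For readers unfamiliar with PNNs, the sketch below illustrates the kind of classifier the abstract describes: a four-layer network (input, pattern, summation, output) that stores each feedback example as a Gaussian kernel and decides by comparing per-class kernel averages, so no feature-independence assumption is needed, and accumulating "history" simply grows the pattern layer across iterations. This is a minimal illustration under our own assumptions; the class names, the smoothing parameter `sigma`, and the feedback interface are hypothetical, not the authors' implementation.

```python
import numpy as np

class PNN:
    """Minimal Probabilistic Neural Network sketch (illustrative only).

    Four layers: input, pattern (one Gaussian kernel per stored training
    vector), summation (one node per class), and output (argmax over the
    per-class scores).
    """

    def __init__(self, sigma=0.1):
        self.sigma = sigma    # kernel smoothing width (assumed parameter)
        self.patterns = []    # stored feature vectors (pattern layer)
        self.labels = []      # class label for each stored vector

    def add_feedback(self, vectors, label):
        """Accumulate user feedback as history: vectors marked in past
        iterations stay in the pattern layer, so later retrieval rounds
        condition on all feedback seen so far."""
        for v in vectors:
            self.patterns.append(np.asarray(v, dtype=float))
            self.labels.append(label)

    def class_scores(self, x):
        """Summation layer: average Gaussian kernel response of query
        vector x against each class's stored patterns."""
        x = np.asarray(x, dtype=float)
        per_class = {}
        for p, c in zip(self.patterns, self.labels):
            d2 = np.sum((x - p) ** 2)
            k = np.exp(-d2 / (2.0 * self.sigma ** 2))
            per_class.setdefault(c, []).append(k)
        return {c: float(np.mean(ks)) for c, ks in per_class.items()}

    def classify(self, x):
        """Output layer: choose the class with the largest score."""
        scores = self.class_scores(x)
        return max(scores, key=scores.get)

if __name__ == "__main__":
    pnn = PNN(sigma=0.2)
    # Two feedback rounds accumulate into one shared history.
    pnn.add_feedback([[0.1, 0.9], [0.2, 0.8]], label="relevant")
    pnn.add_feedback([[0.9, 0.1]], label="irrelevant")
    print(pnn.classify([0.15, 0.85]))  # -> "relevant"
```

Note that "training" here is only the storage of labeled vectors, which is why a PNN needs little computation time compared with iteratively trained networks.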

Original language: English
Pages (from-to): 138-141
Number of pages: 4
Journal: Proceedings - International Conference on Pattern Recognition
Volume: 16
Issue number: 4
Publication status: Published - 2002 Dec 1

All Science Journal Classification (ASJC) codes

  • Computer Vision and Pattern Recognition
