Optimizing 1-Nearest Prototype Classifiers

Paul Wohlhart, Martin Kostinger, Michael Donoser, Peter M. Roth, Horst Bischof; The IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2013, pp. 460-467


The development of complex, powerful classifiers and their constant improvement have contributed much to the progress in many fields of computer vision. However, the trend towards large-scale datasets has revived interest in simpler classifiers that reduce runtime. Simple nearest neighbor classifiers have several beneficial properties, such as low complexity and inherent multi-class handling, but their runtime is linear in the size of the database. Recent related work represents data samples by assigning them to a set of prototypes that partition the input feature space and then applies linear classifiers on top of this representation to approximate decision boundaries in a locally linear manner. In this paper, we go a step beyond these approaches and focus purely on 1-nearest prototype classification, proposing a novel algorithm for deriving optimal prototypes in a discriminative manner from the training samples. Our method is implicitly multi-class capable, parameter-free, avoids overfitting to noise and, since testing requires only comparisons to the derived prototypes, is highly efficient. Experiments demonstrate that we are able to outperform related locally linear methods, while even getting close to the results of more complex classifiers.
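The test-time rule described above can be sketched in a few lines. The snippet below shows only plain 1-nearest-prototype classification (assign each sample the label of its closest prototype); it does not reproduce the paper's contribution, which is the discriminative optimization of the prototypes themselves. All names and the toy data are hypothetical.

```python
import numpy as np

def nearest_prototype_predict(X, prototypes, labels):
    """Classify each row of X by the label of its nearest prototype.

    X          : (n_samples, n_features) query points
    prototypes : (n_prototypes, n_features) learned prototypes
    labels     : (n_prototypes,) class label of each prototype
    """
    # Squared Euclidean distance from every sample to every prototype;
    # runtime depends only on the (small) number of prototypes, not on
    # the size of the original training database.
    d = ((X[:, None, :] - prototypes[None, :, :]) ** 2).sum(axis=2)
    return labels[np.argmin(d, axis=1)]

# Toy example with two prototypes per class (hypothetical values)
prototypes = np.array([[0.0, 0.0], [1.0, 0.0], [5.0, 5.0], [6.0, 5.0]])
labels = np.array([0, 0, 1, 1])
X = np.array([[0.2, 0.1], [5.5, 4.9]])
pred = nearest_prototype_predict(X, prototypes, labels)
```

Because only distances to the prototypes are computed, the cost per query scales with the number of prototypes rather than the number of training samples, which is the efficiency argument made in the abstract.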

Related Material

@InProceedings{Wohlhart_2013_CVPR,
author = {Wohlhart, Paul and Kostinger, Martin and Donoser, Michael and Roth, Peter M. and Bischof, Horst},
title = {Optimizing 1-Nearest Prototype Classifiers},
booktitle = {The IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
month = {June},
year = {2013}
}