Please use this identifier to cite or link to this item: https://open.uns.ac.rs/handle/123456789/13756
Title: Hubness-based fuzzy measures for high-dimensional k-nearest neighbor classification
Authors: Tomašev, N.
Radovanović, M.
Mladenić, D.
Ivanović, M.
Issue Date: 7-Sep-2011
Journal: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Abstract: High-dimensional data are by their very nature often difficult to handle by conventional machine-learning algorithms, which is usually characterized as an aspect of the curse of dimensionality. However, it was shown that some of the arising high-dimensional phenomena can be exploited to increase algorithm accuracy. One such phenomenon is hubness, which refers to the emergence of hubs in high-dimensional spaces, where hubs are influential points included in many k-neighbor sets of other points in the data. This phenomenon was previously used to devise a crisp weighted voting scheme for the k-nearest neighbor classifier. In this paper we go a step further by embracing the soft approach, and propose several fuzzy measures for k-nearest neighbor classification, all based on hubness, which express fuzziness of elements appearing in k-neighborhoods of other points. Experimental evaluation on real data from the UCI repository and the image domain suggests that the fuzzy approach provides a useful measure of confidence in the predicted labels, resulting in improvement over the crisp weighted method, as well as over the standard kNN classifier. © 2011 Springer-Verlag.
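The idea described in the abstract can be illustrated with a minimal sketch (not the paper's exact fuzzy measures): each training point's hubness is counted per class of the neighborhoods it enters, and at prediction time neighbors cast fuzzy, class-hubness-weighted votes instead of crisp ones. All names and the Laplace smoothing below are assumptions for illustration.

```python
import numpy as np

def class_hubness_fuzzy_knn(X_train, y_train, X_test, k=5):
    """Illustrative sketch of hubness-based fuzzy kNN voting.

    For each training point x, count how often x appears in the
    k-neighbor sets of other training points (its hubness), split by
    the class of the point whose neighborhood it enters. At prediction
    time, each neighbor votes for class c with a weight proportional
    to its (smoothed) class-c hubness.
    """
    n, classes = len(X_train), np.unique(y_train)

    # Pairwise Euclidean distances within the training set.
    d = np.linalg.norm(X_train[:, None, :] - X_train[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)          # a point is not its own neighbor
    knn = np.argsort(d, axis=1)[:, :k]   # k nearest neighbors of each point

    # class_hubness[x, c] = number of times x occurs in the k-NN sets
    # of class-c training points.
    class_hubness = np.zeros((n, len(classes)))
    class_index = {c: i for i, c in enumerate(classes)}
    for i in range(n):
        for j in knn[i]:
            class_hubness[j, class_index[y_train[i]]] += 1

    # Laplace-smoothed fuzzy class membership of each training point
    # (an assumed smoothing choice, not taken from the paper).
    memberships = (class_hubness + 1) / (
        class_hubness.sum(axis=1, keepdims=True) + len(classes))

    preds = []
    for x in X_test:
        dx = np.linalg.norm(X_train - x, axis=1)
        neighbors = np.argsort(dx)[:k]
        votes = memberships[neighbors].sum(axis=0)  # sum of fuzzy votes
        preds.append(classes[np.argmax(votes)])
    return np.array(preds)

# Toy usage: two well-separated clusters.
X = np.array([[0, 0], [0, 1], [1, 0], [5, 5], [5, 6], [6, 5]], dtype=float)
y = np.array([0, 0, 0, 1, 1, 1])
print(class_hubness_fuzzy_knn(X, y, np.array([[0.2, 0.2], [5.5, 5.5]]), k=2))
```

Because votes are fuzzy memberships rather than hard labels, the summed vote vector also serves as a rough confidence score for the predicted label, which is the property the abstract highlights.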
URI: https://open.uns.ac.rs/handle/123456789/13756
ISBN: 9783642231988
ISSN: 0302-9743
DOI: 10.1007/978-3-642-23199-5_2
Appears in Collections:PMF Publikacije/Publications

