Please use this identifier to cite or link to this item:
https://open.uns.ac.rs/handle/123456789/31178
Title: Comparison of different weighting schemes for the kNN classifier on time-series data
Authors: Geler Zoltan; Kurbalija Vladimir; Radovanović Miloš; Ivanović Mirjana
Issue Date: 2016
Journal: Knowledge and Information Systems
Abstract: © 2015, Springer-Verlag London. Many well-known machine learning algorithms have been applied to the task of time-series classification, including decision trees, neural networks, support vector machines and others. However, it was shown that the simple 1-nearest neighbor (1NN) classifier, coupled with an elastic distance measure like Dynamic Time Warping (DTW), often produces better results than more complex classifiers on time-series data, including k-nearest neighbor (kNN) for values of k > 1. In this article, we revisit the kNN classifier on time-series data by considering ten classic distance-based vote weighting schemes in the context of Euclidean distance, as well as four commonly used elastic distance measures: DTW, Longest Common Subsequence, Edit Distance with Real Penalty and Edit Distance on Real sequence. Through experiments on the complete collection of UCR time-series datasets, we confirm the view that the 1NN classifier is very hard to beat. Overall, for all considered distance measures, we found that variants of the Dudani weighting scheme produced the best results.
URI: https://open.uns.ac.rs/handle/123456789/31178
ISSN: 0219-1377
DOI: 10.1007/s10115-015-0881-0
Appears in collections: FF Publikacije/Publications; PMF Publikacije/Publications
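The abstract's central idea — a kNN vote weighted by the Dudani scheme, where the nearest neighbor gets full weight and the k-th neighbor gets none — can be illustrated with a minimal sketch. This is not the article's implementation; it assumes Euclidean distance (the elastic measures such as DTW would replace the distance computation), and the function names are illustrative only.

```python
import numpy as np

def dudani_weights(dists):
    """Dudani distance-based vote weights for k sorted neighbor distances:
    the nearest neighbor gets weight 1, the k-th neighbor gets weight 0,
    intermediate neighbors are scaled linearly in between."""
    d1, dk = dists[0], dists[-1]
    if dk == d1:
        # All neighbors equidistant: fall back to uniform weights.
        return np.ones_like(dists, dtype=float)
    return (dk - dists) / (dk - d1)

def knn_dudani_predict(X_train, y_train, x, k=5):
    """Classify x by a Dudani-weighted kNN vote under Euclidean distance."""
    dists = np.linalg.norm(X_train - x, axis=1)
    idx = np.argsort(dists)[:k]          # indices of the k nearest neighbors
    w = dudani_weights(dists[idx])       # distances arrive already sorted
    votes = {}
    for label, weight in zip(y_train[idx], w):
        votes[label] = votes.get(label, 0.0) + weight
    return max(votes, key=votes.get)     # label with the largest weighted vote
```

Note that for k = 1 the scheme degenerates to plain 1NN, which is consistent with the article's observation that 1NN is the baseline to beat.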
SCOPUS™ citations: 26 (checked 29.04.2023)
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.