Please use this identifier to cite or link to this item:
https://open.uns.ac.rs/handle/123456789/31178
Title: Comparison of different weighting schemes for the kNN classifier on time-series data
Authors: Geler Zoltan; Kurbalija Vladimir; Radovanović Miloš; Ivanović Mirjana
Issue Date: 2016
Journal: Knowledge and Information Systems
Abstract: © 2015, Springer-Verlag London. Many well-known machine learning algorithms have been applied to the task of time-series classification, including decision trees, neural networks, support vector machines and others. However, it was shown that the simple 1-nearest neighbor (1NN) classifier, coupled with an elastic distance measure like Dynamic Time Warping (DTW), often produces better results than more complex classifiers on time-series data, including k-nearest neighbor (kNN) for values of k > 1. In this article, we revisit the kNN classifier on time-series data by considering ten classic distance-based vote weighting schemes in the context of Euclidean distance, as well as four commonly used elastic distance measures: DTW, Longest Common Subsequence, Edit Distance with Real Penalty and Edit Distance on Real sequence. Through experiments on the complete collection of UCR time-series datasets, we confirm the view that the 1NN classifier is very hard to beat. Overall, for all considered distance measures, we found that variants of the Dudani weighting scheme produced the best results.
URI: https://open.uns.ac.rs/handle/123456789/31178
ISSN: 0219-1377
DOI: 10.1007/s10115-015-0881-0
Appears in Collections: FF Publikacije/Publications; PMF Publikacije/Publications
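To illustrate the kind of distance-weighted voting the abstract refers to, the following is a minimal sketch of a kNN classifier using the classic Dudani (1976) linear weighting scheme, in which the nearest neighbor receives weight 1 and the farthest of the k neighbors receives weight 0. This is an illustrative example only, not the paper's implementation: the function names are hypothetical, Euclidean distance is used by default, and the paper itself evaluates several Dudani variants and elastic measures (DTW, LCS, ERP, EDR) not shown here.

```python
import numpy as np

def dudani_weights(distances):
    """Classic Dudani (1976) linear weighting for sorted neighbor distances:
    the nearest neighbor gets weight 1, the farthest gets weight 0, and the
    remaining neighbors are interpolated linearly in between."""
    d = np.asarray(distances, dtype=float)
    d_1, d_k = d[0], d[-1]
    if d_k == d_1:                       # all k neighbors equally distant
        return np.ones_like(d)
    return (d_k - d) / (d_k - d_1)

def weighted_knn_predict(train_X, train_y, query, k, dist=None):
    """Predict the label of `query` by a Dudani-weighted vote among its
    k nearest neighbors under the given distance function. Defaults to
    Euclidean distance; an elastic measure such as DTW could be plugged
    in via the `dist` argument instead."""
    if dist is None:
        dist = lambda a, b: np.linalg.norm(np.asarray(a) - np.asarray(b))
    d = np.array([dist(x, query) for x in train_X])
    order = np.argsort(d)[:k]            # indices of the k nearest neighbors
    w = dudani_weights(d[order])
    votes = {}
    for idx, weight in zip(order, w):
        votes[train_y[idx]] = votes.get(train_y[idx], 0.0) + weight
    return max(votes, key=votes.get)     # label with the largest total weight
```

With k = 1 this reduces to the plain 1NN classifier, which is the baseline the article reports as being very hard to beat.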