Please use this identifier to cite or link to this item: https://open.uns.ac.rs/handle/123456789/20623
Title: Weighted kNN and constrained elastic distances for time-series classification
Authors: Geler Zoltan 
Kurbalija Vladimir 
Ivanović Mirjana 
Radovanović Miloš 
Issue Date: 2020
Journal: Expert Systems with Applications
Abstract: © 2020 Elsevier Ltd. Time-series classification has been addressed by a plethora of machine-learning techniques, including neural networks, support vector machines, Bayesian approaches, and others. It is an accepted fact, however, that the plain vanilla 1-nearest neighbor (1NN) classifier, combined with an elastic distance measure such as Dynamic Time Warping (DTW), is competitive and often superior to more complex classification methods, including the majority-voting k-nearest neighbor (kNN) classifier. With this paper we continue our investigation of the kNN classifier on time-series data and the impact of various classic distance-based vote weighting schemes by considering constrained versions of four common elastic distance measures: DTW, Longest Common Subsequence (LCS), Edit Distance with Real Penalty (ERP), and Edit Distance on Real sequence (EDR). By performing experiments on the entire UCR Time Series Classification Archive we show that weighted kNN is able to consistently outperform 1NN. Furthermore, we provide recommendations for the choices of the constraint width parameter r, neighborhood size k, and weighting scheme, for each mentioned elastic distance measure.
URI: https://open.uns.ac.rs/handle/123456789/20623
ISSN: 0957-4174
DOI: 10.1016/j.eswa.2020.113829
Appears in Collections: FF Publikacije/Publications
PMF Publikacije/Publications
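
The abstract's two main ingredients, a global constraint of width r on the elastic measure and a distance-based vote weighting in the kNN classifier, can be sketched in a few lines of Python. The sketch below pairs DTW under a Sakoe-Chiba band with a simple inverse-distance weighting; it is a minimal illustration under assumed conventions (r expressed as a fraction of the series length, squared point-wise costs, inverse-distance weights), not the authors' implementation or a reproduction of their experimental setup.

```python
import numpy as np

def dtw_constrained(x, y, r):
    """DTW restricted to a Sakoe-Chiba band.

    r is the band half-width as a fraction of the (longer) series length,
    one common way the constraint parameter is expressed on UCR datasets;
    other conventions exist."""
    n, m = len(x), len(y)
    w = max(int(r * max(n, m)), abs(n - m))          # band half-width in points
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(max(1, i - w), min(m, i + w) + 1):
            cost = (x[i - 1] - y[j - 1]) ** 2        # squared point-wise cost
            D[i, j] = cost + min(D[i - 1, j],        # insertion
                                 D[i, j - 1],        # deletion
                                 D[i - 1, j - 1])    # match
    return np.sqrt(D[n, m])

def weighted_knn_predict(query, train_series, train_labels, k, r):
    """Distance-weighted kNN vote under constrained DTW.

    Inverse-distance weighting is used here purely as an example of a
    classic distance-based scheme; the paper compares several such schemes."""
    dists = np.array([dtw_constrained(query, s, r) for s in train_series])
    votes = {}
    for idx in np.argsort(dists)[:k]:
        weight = 1.0 / (dists[idx] + 1e-12)          # closer neighbours count more
        label = train_labels[idx]
        votes[label] = votes.get(label, 0.0) + weight
    return max(votes, key=votes.get)

# Tiny illustrative call on hypothetical data:
# train = [np.array([0., 1., 2., 1.]), np.array([5., 5., 4., 5.]), np.array([0., 2., 2., 1.])]
# labels = ["up", "flat", "up"]
# weighted_knn_predict(np.array([0., 1., 2., 2.]), train, labels, k=3, r=0.25)
```

Setting k = 1 (or using uniform weights) recovers the 1NN baseline against which the weighted variants are compared; the paper's recommendations concern how r, k, and the weighting scheme should be chosen for each of the four elastic measures.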
