Please use this identifier to cite or link to this item:
https://open.uns.ac.rs/handle/123456789/1916
Title: Code clone benchmarks overview
Authors: Vislavski, Tijana; Rakić, Gordana
Issue Date: 1-Jan-2018
Journal: CEUR Workshop Proceedings
Abstract: © 2018 by the paper's authors. Traditionally, when a new code clone detection tool is developed, a few well-known and popular benchmarks are used to evaluate the results it achieves. These benchmarks have typically been created by cross-running several state-of-the-art clone detection tools, in order to overcome the bias of using just one tool, and combining their result sets in some fashion. These candidate clones, or more specifically subsets of them, have then been manually examined by clone experts or other participants, who judge whether a candidate is a true clone or not. Many authors have dealt with the problem of creating the most objective benchmarks: how the candidate sets should be created, who should judge them, and whether the judgment of these participants can be trusted. One of the main pitfalls, as with the development of a clone detection tool, is the inherent lack of formal definitions and standards when it comes to clones and their classification. Recently, some new approaches have been presented that do not depend on any clone tool but instead use search heuristics to find specific functionalities; these candidates, too, are manually examined by judges, who classify them as true or false clones. The goal of this paper is to examine state-of-the-art code clone benchmarks, as well as studies on the reliability of clone judges (and, by extension, the reliability of the benchmarks themselves) and their possible usage in a cross-language clone detection context.
URI: https://open.uns.ac.rs/handle/123456789/1916
ISBN: 9788670314733
ISSN: 1613-0073
Appears in collections: PMF Publikacije/Publications
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.