Please use this identifier to cite or link to this item: https://open.uns.ac.rs/handle/123456789/9001
Title: Dynamic saliency models and human attention: A comparative study on videos
Authors: Riche N.
Mancas M.
Ćulibrk, Dubravko 
Crnojević, Vladimir
Gosselin B.
Dutoit T.
Issue Date: 11-Apr-2013
Journal: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Abstract: Significant progress has been made on computational models of bottom-up visual attention (saliency). However, efficient ways of comparing these models for still images remain an open research question, and the problem is even more challenging when dealing with videos and dynamic saliency. This paper proposes a framework for evaluating dynamic-saliency models, based on a new database of diverse videos for which eye-tracking data have been collected. In addition, we present evaluation results for four state-of-the-art dynamic-saliency models, two of which had not previously been verified against eye-tracking data. © 2013 Springer-Verlag.
URI: https://open.uns.ac.rs/handle/123456789/9001
ISBN: 9783642374302
ISSN: 0302-9743
DOI: 10.1007/978-3-642-37431-9_45
Appears in Collections: FTN Publikacije/Publications

SCOPUS™ citations: 33 (checked 26.08.2023)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.