Please use this identifier to cite or link to this item:
https://open.uns.ac.rs/handle/123456789/13416
Title: Data-driven approach to dynamic visual attention modelling
Authors: Ćulibrk, Dubravko; Sladojević, Srđan; Riche, N.; Mancas, M.; Crnojević, V.
Date of issue: 12-Jun-2012
Journal: Proceedings of SPIE - The International Society for Optical Engineering
Abstract: Visual attention deployment mechanisms allow the Human Visual System to cope with an overwhelming amount of visual data by dedicating most of the processing power to objects of interest. The ability to automatically detect areas of the visual scene that will be attended to by humans is of interest for a large number of applications, from video coding and video quality assessment to scene understanding. Due to this fact, visual saliency (bottom-up attention) models have generated significant scientific interest in recent years. Most recent work in this area deals with dynamic models of attention that handle moving stimuli (videos) instead of the traditionally used still images. Visual saliency models are usually evaluated against ground-truth eye-tracking data collected from human subjects. However, there are precious few recently published approaches that try to learn saliency from eye-tracking data and, to the best of our knowledge, no approaches that try to do so where dynamic saliency is concerned. The paper attempts to fill this gap and describes an approach to data-driven dynamic saliency model learning. A framework is proposed that enables the use of eye-tracking data to train an arbitrary machine learning algorithm, using arbitrary features derived from the scene. We evaluate the methodology using features from a state-of-the-art dynamic saliency model and show how simple machine learning algorithms can be trained to distinguish between visually salient and non-salient parts of the scene. © 2012 SPIE.
URI: https://open.uns.ac.rs/handle/123456789/13416
ISBN: 9780819491282
ISSN: 0277786X
DOI: 10.1117/12.923559
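The framework summarized in the abstract (arbitrary per-scene features paired with eye-tracking labels, fed to an arbitrary learner) can be sketched in a few lines. The features, synthetic data, and choice of learner below are invented for illustration and are not the authors' implementation:

```python
# Toy illustration of the framework in the abstract: blocks of a scene are
# described by features (invented here: motion contrast, rarity), labelled
# salient / non-salient as eye-tracking ground truth would label them, and
# a simple learner (plain logistic regression via gradient descent) is
# trained to separate the two classes.
import math
import random

random.seed(0)

def make_block(salient):
    """Synthetic per-block features; salient blocks score higher on both."""
    if salient:
        return [random.gauss(0.8, 0.15), random.gauss(0.7, 0.15)], 1
    return [random.gauss(0.3, 0.15), random.gauss(0.25, 0.15)], 0

data = [make_block(i % 2 == 0) for i in range(200)]

def train_logistic(data, lr=0.5, epochs=200):
    # Stochastic gradient descent on the logistic loss; any other
    # classifier could be substituted, as the framework is learner-agnostic.
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in data:
            p = 1.0 / (1.0 + math.exp(-(w[0] * x[0] + w[1] * x[1] + b)))
            g = p - y
            w[0] -= lr * g * x[0]
            w[1] -= lr * g * x[1]
            b -= lr * g
    return w, b

w, b = train_logistic(data)

def is_salient(x):
    return 1.0 / (1.0 + math.exp(-(w[0] * x[0] + w[1] * x[1] + b))) > 0.5

acc = sum(is_salient(x) == bool(y) for x, y in data) / len(data)
print(f"training accuracy: {acc:.2f}")
```

In the paper's actual setting the features come from a dynamic saliency model and the labels from recorded human fixations; the sketch only shows how such pairs plug into a generic classifier.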
Appears in collections: FTN Publikacije/Publications
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.