Please use this identifier to cite or link to this item: https://open.uns.ac.rs/handle/123456789/15048
DC Field | Value | Language
dc.contributor.author | Ćulibrk, Dubravko | en
dc.contributor.author | Socek D. | en
dc.contributor.author | Marques O. | en
dc.contributor.author | Furht B. | en
dc.date.accessioned | 2020-03-03T14:58:21Z | -
dc.date.available | 2020-03-03T14:58:21Z | -
dc.date.issued | 2007-12-01 | en
dc.identifier.uri | https://open.uns.ac.rs/handle/123456789/15048 | -
dc.description.abstract | Background modelling Neural Networks (BNNs) represent an approach to motion-based object segmentation in video sequences. BNNs are probabilistic classifiers with nonparametric, kernel-based estimation of the underlying probability density functions. The paper presents an enhancement of the methodology, introducing automatic estimation and adaptation of the kernel width. The proposed enhancement eliminates the need to determine the kernel width empirically. The selection of a kernel width appropriate for the features used for segmentation is critical to achieving good segmentation results. The improvement makes the methodology easier to use and more adaptive, and facilitates the evaluation of the approach. | en
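The abstract describes kernel-based density estimation whose bandwidth (kernel width) is chosen automatically rather than tuned by hand. The paper's specific adaptation scheme is not reproduced in this record; the sketch below only illustrates the general idea with a standard data-driven bandwidth rule (Silverman's rule of thumb) applied to a per-pixel background model. The sample values and threshold logic are hypothetical.

```python
import numpy as np

def silverman_bandwidth(samples):
    # Silverman's rule of thumb: h = 1.06 * sigma * n^(-1/5).
    # A generic automatic kernel-width estimate, not the paper's method.
    n = len(samples)
    sigma = np.std(samples, ddof=1)
    return 1.06 * sigma * n ** (-1.0 / 5.0)

def kde(x, samples, h):
    # Gaussian kernel density estimate at point x given sample history.
    u = (x - samples) / h
    return np.mean(np.exp(-0.5 * u * u)) / (h * np.sqrt(2.0 * np.pi))

# Hypothetical history of intensity values observed at one pixel.
history = np.array([100.0, 102.0, 99.0, 101.0, 100.5, 98.0, 103.0, 100.0])
h = silverman_bandwidth(history)

# An observation near the model gets high density (likely background);
# a distant observation gets low density (likely a moving object).
p_bg = kde(101.0, history, h)
p_fg = kde(180.0, history, h)
```

The point of the automatic rule is that `h` scales with the spread of the observed feature values, so no per-sequence manual tuning of the kernel width is needed.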
dc.relation.ispartof | VISAPP 2007 - 2nd International Conference on Computer Vision Theory and Applications, Proceedings | en
dc.title | Automatic kernel width selection for neural network based video object segmentation | en
dc.type | Conference Paper | en
dc.identifier.scopus | 2-s2.0-67650261092 | en
dc.identifier.url | https://api.elsevier.com/content/abstract/scopus_id/67650261092 | en
dc.relation.lastpage | 479 | en
dc.relation.firstpage | 472 | en
dc.relation.issue | MTSV/ | -
dc.relation.volume | IU | en
item.grantfulltext | none | -
item.fulltext | No Fulltext | -
crisitem.author.dept | Fakultet tehničkih nauka, Departman za industrijsko inženjerstvo i menadžment | -
crisitem.author.parentorg | Fakultet tehničkih nauka | -
Appears in Collections:FTN Publikacije/Publications