Spatiotemporal Saliency Detection Using Textural Contrast and Its Applications

Cited 54 times in Web of Science · Cited 68 times in Scopus
  • Hit : 529
  • Download : 0
DC Field | Value | Language
dc.contributor.author | Kim, Won-Jun | ko
dc.contributor.author | Kim, Changick | ko
dc.date.accessioned | 2014-09-01T07:18:42Z | -
dc.date.available | 2014-09-01T07:18:42Z | -
dc.date.created | 2014-06-03 | -
dc.date.issued | 2014-04 | -
dc.identifier.citation | IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, v.24, no.4, pp.646 - 659 | -
dc.identifier.issn | 1051-8215 | -
dc.identifier.uri | http://hdl.handle.net/10203/189205 | -
dc.description.abstract | Saliency detection has been extensively studied due to its promising contributions for various computer vision applications. However, most existing methods are easily biased toward edges or corners, which are statistically significant, but not necessarily relevant. Moreover, they often fail to find salient regions in complex scenes due to ambiguities between salient regions and highly textured backgrounds. In this paper, we present a novel unified framework for spatiotemporal saliency detection based on textural contrast. Our method is simple and robust, yet biologically plausible; thus, it can be easily extended to various applications, such as image retargeting, object segmentation, and video surveillance. Based on various datasets, we conduct comparative evaluations of 12 representative saliency detection models presented in the literature, and the results show that the proposed scheme outperforms other previously developed methods in detecting salient regions of the static and dynamic scenes. | -
dc.language | English | -
dc.publisher | IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC | -
dc.subject | VISUAL-ATTENTION | -
dc.subject | IMAGE SEGMENTATION | -
dc.subject | OBJECT DETECTION | -
dc.subject | REGION DETECTION | -
dc.subject | DYNAMIC SCENES | -
dc.subject | MODEL | -
dc.subject | VIDEO | -
dc.subject | SUBTRACTION | -
dc.subject | TRACKING | -
dc.title | Spatiotemporal Saliency Detection Using Textural Contrast and Its Applications | -
dc.type | Article | -
dc.identifier.wosid | 000334522800008 | -
dc.identifier.scopusid | 2-s2.0-84897978322 | -
dc.type.rims | ART | -
dc.citation.volume | 24 | -
dc.citation.issue | 4 | -
dc.citation.beginningpage | 646 | -
dc.citation.endingpage | 659 | -
dc.citation.publicationname | IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY | -
dc.identifier.doi | 10.1109/TCSVT.2013.2290579 | -
dc.contributor.localauthor | Kim, Changick | -
dc.type.journalArticle | Article | -
dc.subject.keywordAuthor | Comparative evaluations | -
dc.subject.keywordAuthor | computer vision applications | -
dc.subject.keywordAuthor | human visual attention | -
dc.subject.keywordAuthor | saliency detection | -
dc.subject.keywordAuthor | textural contrast | -
dc.subject.keywordPlus | VISUAL-ATTENTION | -
dc.subject.keywordPlus | IMAGE SEGMENTATION | -
dc.subject.keywordPlus | OBJECT DETECTION | -
dc.subject.keywordPlus | REGION DETECTION | -
dc.subject.keywordPlus | DYNAMIC SCENES | -
dc.subject.keywordPlus | GRAPH CUTS | -
dc.subject.keywordPlus | MODEL | -
dc.subject.keywordPlus | VIDEO | -
dc.subject.keywordPlus | SUBTRACTION | -
dc.subject.keywordPlus | TRACKING | -
Appears in Collections
EE-Journal Papers (저널논문)
Files in This Item
There are no files associated with this item.