DASC: Robust Dense Descriptor for Multi-Modal and Multi-Spectral Correspondence Estimation

Cited 36 times in Web of Science; cited 0 times in Scopus
  • Hits: 7
  • Downloads: 0
DC Field | Value | Language
dc.contributor.author | Kim, Seungryong | ko
dc.contributor.author | Min, Dongbo | ko
dc.contributor.author | Ham, Bumsub | ko
dc.contributor.author | Do, Minh N. | ko
dc.contributor.author | Sohn, Kwanghoon | ko
dc.date.accessioned | 2024-08-16T03:00:11Z | -
dc.date.available | 2024-08-16T03:00:11Z | -
dc.date.created | 2024-08-16 | -
dc.date.issued | 2017-09 | -
dc.identifier.citation | IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, v.39, no.9, pp.1712 - 1729 | -
dc.identifier.issn | 0162-8828 | -
dc.identifier.uri | http://hdl.handle.net/10203/322326 | -
dc.description.abstract | Establishing dense correspondences between multiple images is a fundamental task in many applications. However, finding reliable correspondences between multi-modal or multi-spectral images remains unsolved due to their challenging photometric and geometric variations. In this paper, we propose a novel dense descriptor, called dense adaptive self-correlation (DASC), to estimate dense multi-modal and multi-spectral correspondences. Based on the observation that self-similarity within an image is robust to imaging modality variations, we define the descriptor as a series of adaptive self-correlation similarity measures between patches sampled by randomized receptive field pooling, in which the sampling pattern is obtained through discriminative learning. The computational redundancy of dense descriptors is dramatically reduced by applying fast edge-aware filtering. Furthermore, to address geometric variations including scale and rotation, we propose a geometry-invariant DASC (GI-DASC) descriptor that effectively leverages the DASC through a superpixel-based representation. For a quantitative evaluation of the GI-DASC, we build a novel multi-modal benchmark with varying photometric and geometric conditions. Experimental results demonstrate the outstanding performance of the DASC and GI-DASC in many cases of dense multi-modal and multi-spectral correspondence. | -
dc.language | English | -
dc.publisher | IEEE COMPUTER SOC | -
dc.title | DASC: Robust Dense Descriptor for Multi-Modal and Multi-Spectral Correspondence Estimation | -
dc.type | Article | -
dc.identifier.wosid | 000406840800002 | -
dc.identifier.scopusid | 2-s2.0-85029365917 | -
dc.type.rims | ART | -
dc.citation.volume | 39 | -
dc.citation.issue | 9 | -
dc.citation.beginningpage | 1712 | -
dc.citation.endingpage | 1729 | -
dc.citation.publicationname | IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE | -
dc.identifier.doi | 10.1109/TPAMI.2016.2615619 | -
dc.contributor.localauthor | Kim, Seungryong | -
dc.contributor.nonIdAuthor | Min, Dongbo | -
dc.contributor.nonIdAuthor | Ham, Bumsub | -
dc.contributor.nonIdAuthor | Do, Minh N. | -
dc.contributor.nonIdAuthor | Sohn, Kwanghoon | -
dc.description.isOpenAccess | N | -
dc.type.journalArticle | Article | -
dc.subject.keywordAuthor | Dense correspondence | -
dc.subject.keywordAuthor | descriptor | -
dc.subject.keywordAuthor | multi-spectral | -
dc.subject.keywordAuthor | multi-modal | -
dc.subject.keywordAuthor | edge-aware filtering | -
dc.subject.keywordPlus | REGISTRATION | -
dc.subject.keywordPlus | IMAGES | -
dc.subject.keywordPlus | SIFT | -
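
The abstract above builds the descriptor from self-correlation: for each pixel, patch-to-patch similarities measured within a local support window, rather than raw intensities, form the descriptor vector, which is what gives robustness across imaging modalities. The sketch below is a minimal, hypothetical NumPy illustration of that core idea, not the authors' implementation; the fixed random sampling pattern, the patch and window sizes, and the plain zero-mean normalized cross-correlation (instead of the paper's adaptive, edge-aware measure and learned sampling pattern) are simplifying assumptions.

import numpy as np

def patch(img, y, x, r):
    # (2r+1) x (2r+1) patch centered at (y, x); assumed to lie fully inside the image
    return img[y - r:y + r + 1, x - r:x + r + 1]

def ncc(p, q, eps=1e-8):
    # zero-mean normalized cross-correlation between two equally sized patches
    p = p - p.mean()
    q = q - q.mean()
    return float((p * q).sum() / (np.sqrt((p * p).sum() * (q * q).sum()) + eps))

def self_correlation_descriptor(img, y, x, offsets, r=2):
    # Descriptor for pixel (y, x): correlations between patch pairs sampled around the
    # pixel according to a fixed list of offset pairs (a stand-in for the randomized,
    # learned receptive field pooling described in the abstract).
    desc = np.array([ncc(patch(img, y + dy1, x + dx1, r),
                         patch(img, y + dy2, x + dx2, r))
                     for (dy1, dx1), (dy2, dx2) in offsets])
    desc = (desc + 1.0) / 2.0           # map correlations from [-1, 1] to [0, 1]
    n = np.linalg.norm(desc)
    return desc / n if n > 0 else desc  # L2-normalize the descriptor vector

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = rng.random((64, 64)).astype(np.float32)
    # 32 random patch-pair offsets inside a support window of radius 8 (illustrative only)
    pts = rng.integers(-8, 9, size=(32, 2, 2))
    offsets = [((int(a[0]), int(a[1])), (int(b[0]), int(b[1]))) for a, b in pts]
    d = self_correlation_descriptor(img, 32, 32, offsets)
    print(d.shape, float(np.linalg.norm(d)))

Per the abstract, the published method additionally learns the sampling pattern discriminatively, replaces the plain correlation with an adaptive self-correlation measure, and shares computation across neighboring pixels via fast edge-aware filtering, which is what makes dense, per-pixel computation tractable.
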
Appears in Collection
AI-Journal Papers (저널논문)
Files in This Item
There are no files associated with this item.
