S3: A Spectral-Spatial Structure Loss for Pan-Sharpening Networks

Cited 14 times in Web of Science; cited 8 times in Scopus
DC Field | Value | Language
dc.contributor.author | Choi, Jae-Seok | ko
dc.contributor.author | Kim, Yongwoo | ko
dc.contributor.author | Kim, Munchurl | ko
dc.date.accessioned | 2020-05-19T01:20:08Z | -
dc.date.available | 2020-05-19T01:20:08Z | -
dc.date.created | 2019-11-22 | -
dc.date.issued | 2020-05 | -
dc.identifier.citation | IEEE GEOSCIENCE AND REMOTE SENSING LETTERS, v.17, no.5, pp.829 - 833 | -
dc.identifier.issn | 1545-598X | -
dc.identifier.uri | http://hdl.handle.net/10203/274231 | -
dc.description.abstract | Recently, many deep-learning-based pan-sharpening methods have been proposed for generating high-quality pan-sharpened (PS) satellite images. These methods focused on various types of convolutional neural network (CNN) structures, which were trained by simply minimizing a spectral loss between network outputs and the corresponding high-resolution (HR) multi-spectral (MS) target images. However, owing to different sensor characteristics and acquisition times, HR panchromatic (PAN) and low-resolution MS image pairs tend to have large pixel misalignments, especially for moving objects in the images. Conventional CNNs trained with only the spectral loss on these satellite image data sets often produce PS images of low visual quality, including double-edge artifacts along strong edges and ghosting artifacts on moving objects. In this letter, we propose a novel loss function, called a spectral-spatial structure (S3) loss, based on the correlation maps between MS targets and PAN inputs. Our proposed S3 loss can be very effectively used for pan-sharpening with various types of CNN structures, resulting in significant visual improvements on PS images with suppressed artifacts. | -
dc.language | English | -
dc.publisher | IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC | -
dc.title | S3: A Spectral-Spatial Structure Loss for Pan-Sharpening Networks | -
dc.type | Article | -
dc.identifier.wosid | 000529957500021 | -
dc.identifier.scopusid | 2-s2.0-85084149631 | -
dc.type.rims | ART | -
dc.citation.volume | 17 | -
dc.citation.issue | 5 | -
dc.citation.beginningpage | 829 | -
dc.citation.endingpage | 833 | -
dc.citation.publicationname | IEEE GEOSCIENCE AND REMOTE SENSING LETTERS | -
dc.identifier.doi | 10.1109/LGRS.2019.2934493 | -
dc.contributor.localauthor | Kim, Munchurl | -
dc.contributor.nonIdAuthor | Kim, Yongwoo | -
dc.description.isOpenAccess | N | -
dc.type.journalArticle | Article | -
dc.subject.keywordAuthor | Training | -
dc.subject.keywordAuthor | Satellites | -
dc.subject.keywordAuthor | Correlation | -
dc.subject.keywordAuthor | Spatial resolution | -
dc.subject.keywordAuthor | Visualization | -
dc.subject.keywordAuthor | Convolutional neural networks | -
dc.subject.keywordAuthor | Convolutional neural network (CNN) | -
dc.subject.keywordAuthor | deep learning | -
dc.subject.keywordAuthor | pan colorization | -
dc.subject.keywordAuthor | pan-sharpening | -
dc.subject.keywordAuthor | satellite imagery | -
dc.subject.keywordAuthor | spectral-spatial structure | -
dc.subject.keywordAuthor | super-resolution (SR) | -
dc.subject.keywordPlus | DECOMPOSITION | -
dc.subject.keywordPlus | IMAGES | -
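The abstract describes a loss built on local correlation maps between the MS target and the PAN input, which down-weight misaligned regions that would otherwise cause double-edge and ghosting artifacts. As a rough, hypothetical sketch only (not the paper's exact formulation: the window size, the band-averaged intensity, the `s3_style_loss` interface, and the gradient-matching spatial term are all assumptions), a correlation-weighted spectral-spatial loss might look like:

```python
import numpy as np

def local_correlation(a, b, win=2):
    """Local Pearson correlation between two 2-D arrays over (2*win+1)^2 windows."""
    H, W = a.shape
    corr = np.zeros((H, W))
    for i in range(H):
        for j in range(W):
            ya = a[max(0, i - win):i + win + 1, max(0, j - win):j + win + 1].ravel()
            yb = b[max(0, i - win):i + win + 1, max(0, j - win):j + win + 1].ravel()
            ya = ya - ya.mean()
            yb = yb - yb.mean()
            denom = np.sqrt((ya ** 2).sum() * (yb ** 2).sum()) + 1e-8
            corr[i, j] = (ya * yb).sum() / denom
    return corr

def s3_style_loss(ps, ms_target, pan, alpha=0.5):
    """Illustrative correlation-weighted loss (assumed form, not the authors' code).

    ps, ms_target: (bands, H, W) pan-sharpened output and MS target.
    pan: (H, W) panchromatic input.
    """
    # Crude MS intensity: band average (the paper may weight bands differently).
    intensity = ms_target.mean(axis=0)
    # Weight map from local MS-PAN correlation: low where pixels are misaligned.
    w = np.abs(local_correlation(intensity, pan))
    # Spectral term: correlation-weighted L1 distance to the MS target.
    spectral = (w * np.abs(ps - ms_target)).mean()
    # Spatial term: match PS gradients to PAN gradients where correlation is high.
    gx_ps = np.diff(ps.mean(axis=0), axis=1)
    gx_pan = np.diff(pan, axis=1)
    spatial = (w[:, :-1] * np.abs(gx_ps - gx_pan)).mean()
    return spectral + alpha * spatial
```

The key design idea conveyed by the abstract is the weighting itself: where MS and PAN disagree (e.g. moving objects), the correlation weight shrinks, so the network is not penalized for refusing to copy misaligned edges.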
Appears in Collection
EE-Journal Papers (Journal Papers)
Files in This Item
There are no files associated with this item.
This item is cited by other documents in WoS
