Physically-inspired Deep Light Estimation from a Homogeneous-Material Object for Mixed Reality Lighting

Cited 11 times in Web of Science · Cited 4 times in Scopus
  • Hit: 871
  • Download: 0
DC Field | Value | Language
dc.contributor.author | Park, Jinwoo | ko
dc.contributor.author | Park, Hunmin | ko
dc.contributor.author | Yoon, Sung-Eui | ko
dc.contributor.author | Woo, Woontack | ko
dc.date.accessioned | 2020-04-21T07:20:16Z | -
dc.date.available | 2020-04-21T07:20:16Z | -
dc.date.created | 2020-03-18 | -
dc.date.issued | 2020-03 | -
dc.identifier.citation | IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS, v.26, no.5, pp.2002 - 2011 | -
dc.identifier.issn | 1077-2626 | -
dc.identifier.uri | http://hdl.handle.net/10203/273944 | -
dc.description.abstract | In mixed reality (MR), augmenting virtual objects consistently with real-world illumination is one of the key factors that provide a realistic and immersive user experience. For this purpose, we propose a novel deep learning-based method to estimate high dynamic range (HDR) illumination from a single RGB image of a reference object. To obtain the illumination of the current scene, previous approaches inserted a special camera into that scene, which may interfere with the user's immersion, or they analyzed radiances reflected from a passive light probe with a specific type of material or a known shape. The proposed method does not require any additional gadgets or strong prior cues, and aims to predict illumination from a single image of an observed object with a wide range of homogeneous materials and shapes. To effectively solve this ill-posed inverse rendering problem, three sequential deep neural networks are employed based on a physically-inspired design. These networks perform end-to-end regression to gradually decrease dependency on the material and shape. To cover various conditions, the proposed networks are trained on a large synthetic dataset generated by physically-based rendering. Finally, the reconstructed HDR illumination enables realistic image-based lighting of virtual objects in MR. Experimental results demonstrate the effectiveness of this approach compared against state-of-the-art methods. The paper also suggests some interesting MR applications in indoor and outdoor scenes. | -
dc.language | English | -
dc.publisher | IEEE COMPUTER SOC | -
dc.title | Physically-inspired Deep Light Estimation from a Homogeneous-Material Object for Mixed Reality Lighting | -
dc.type | Article | -
dc.identifier.wosid | 000523746000019 | -
dc.identifier.scopusid | 2-s2.0-85079681600 | -
dc.type.rims | ART | -
dc.citation.volume | 26 | -
dc.citation.issue | 5 | -
dc.citation.beginningpage | 2002 | -
dc.citation.endingpage | 2011 | -
dc.citation.publicationname | IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS | -
dc.identifier.doi | 10.1109/TVCG.2020.2973050 | -
dc.contributor.localauthor | Yoon, Sung-Eui | -
dc.contributor.localauthor | Woo, Woontack | -
dc.description.isOpenAccess | N | -
dc.type.journalArticle | Article; Proceedings Paper | -
dc.subject.keywordAuthor | Lighting | -
dc.subject.keywordAuthor | Shape | -
dc.subject.keywordAuthor | Probes | -
dc.subject.keywordAuthor | Estimation | -
dc.subject.keywordAuthor | Virtual reality | -
dc.subject.keywordAuthor | Image reconstruction | -
dc.subject.keywordAuthor | Cameras | -
dc.subject.keywordAuthor | Light estimation | -
dc.subject.keywordAuthor | light probe | -
dc.subject.keywordAuthor | physically-based rendering | -
dc.subject.keywordAuthor | deep learning | -
dc.subject.keywordAuthor | coherent rendering | -
dc.subject.keywordAuthor | mixed reality | -
dc.subject.keywordPlus | AUGMENTED REALITY | -
dc.subject.keywordPlus | ILLUMINATION | -
dc.subject.keywordPlus | REFLECTANCE | -
dc.subject.keywordPlus | COLOR | -
dc.subject.keywordPlus | FACES | -
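The abstract above describes the method only at a high level: three sequential deep neural networks, trained end-to-end on physically-based synthetic renderings, map a single RGB image of a homogeneous-material reference object to an HDR illumination map. The following Python (PyTorch) snippet is a minimal sketch of such a three-stage cascade, not the authors' implementation; the stage roles, channel counts, resolution, and the log-radiance output are illustrative assumptions.

import torch
import torch.nn as nn


def conv_block(in_ch, out_ch):
    # 3x3 convolution followed by ReLU; the basic unit of every stage.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.ReLU(inplace=True),
    )


class StageNet(nn.Module):
    # One stage of the cascade: an image-like input regressed to an image-like output.
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.net = nn.Sequential(
            conv_block(in_ch, 32),
            conv_block(32, 32),
            nn.Conv2d(32, out_ch, kernel_size=3, padding=1),
        )

    def forward(self, x):
        return self.net(x)


class LightEstimationCascade(nn.Module):
    # Three sequential networks trained end-to-end; the per-stage roles below are assumptions.
    def __init__(self):
        super().__init__()
        self.stage1 = StageNet(3, 3)   # hypothetical: suppress material appearance
        self.stage2 = StageNet(3, 3)   # hypothetical: reduce shape dependency
        self.stage3 = StageNet(3, 3)   # hypothetical: regress a log-HDR environment map

    def forward(self, rgb):
        x = self.stage1(rgb)
        x = self.stage2(x)
        log_env = self.stage3(x)
        return torch.exp(log_env)      # back to linear HDR radiance


if __name__ == "__main__":
    model = LightEstimationCascade()
    image = torch.rand(1, 3, 128, 128)   # a single LDR RGB crop of the reference object
    hdr_env = model(image)               # predicted HDR illumination (same resolution here)
    print(hdr_env.shape)                 # torch.Size([1, 3, 128, 128])

In practice the predicted environment map would drive image-based lighting of virtual objects, which is the use case the abstract names; the toy resolution and identical stage architectures here are chosen only to keep the sketch short.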
Appears in Collection
CS-Journal Papers (저널논문)
GCT-Journal Papers (저널논문)
Files in This Item
There are no files associated with this item.
This item is cited by other documents in WoS
