Real-Time 3-D Mapping with Estimating Acoustic Materials

This paper proposes a real-time system that integrates acoustic material estimation from visual appearance with on-the-fly 3-D mapping. The proposed method estimates the acoustic materials of surrounding surfaces in indoor scenes and incorporates them into a 3-D occupancy map as a robot moves around the environment. To estimate acoustic materials from visual cues, we apply a state-of-the-art semantic segmentation CNN, based on the assumption that visual appearance and acoustic materials are strongly associated. Furthermore, we introduce an update policy that handles the material estimates during the online mapping process. As a result, the environment map annotated with acoustic materials can be used for sound-related robotics applications, such as sound source localization that accounts for various acoustic propagation effects (e.g., reflection).
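To make the abstract's idea concrete, the following is a minimal sketch (not the authors' code) of how per-frame material labels from a segmentation CNN could be fused into a 3-D occupancy map. The material label set, the vote-counting update policy, and all class/function names are assumptions made for illustration; the paper's actual update policy may differ.

```python
# Sketch: occupancy map whose voxels also accumulate acoustic-material evidence.
# MATERIALS, the counting-based update policy, and AcousticVoxelMap are assumed
# for illustration only; they are not taken from the paper.
from collections import defaultdict

import numpy as np

MATERIALS = ["concrete", "wood", "glass", "fabric", "metal"]  # assumed label set


class AcousticVoxelMap:
    """Occupancy map whose voxels also carry a per-material vote histogram."""

    def __init__(self, voxel_size=0.05):
        self.voxel_size = voxel_size
        self.occupancy = defaultdict(float)  # voxel index -> occupancy log-odds
        self.material_votes = defaultdict(
            lambda: np.zeros(len(MATERIALS), dtype=np.int64)
        )

    def _key(self, point):
        # Discretize a 3-D point (in metres) to an integer voxel index.
        return tuple(np.floor(np.asarray(point) / self.voxel_size).astype(int))

    def update(self, points, material_ids, hit_log_odds=0.85):
        """Fuse one frame: 3-D points paired with the CNN material id of their pixel."""
        for p, m in zip(points, material_ids):
            k = self._key(p)
            self.occupancy[k] += hit_log_odds   # standard occupancy update
            self.material_votes[k][m] += 1      # accumulate material evidence

    def material_of(self, point):
        """Return the current material estimate for the voxel containing `point`."""
        votes = self.material_votes[self._key(point)]
        return MATERIALS[int(np.argmax(votes))] if votes.sum() else None


if __name__ == "__main__":
    vmap = AcousticVoxelMap()
    # Two observations of roughly the same surface patch, both labelled "concrete" (id 0).
    vmap.update(points=[[1.00, 0.02, 0.30], [1.01, 0.02, 0.31]], material_ids=[0, 0])
    print(vmap.material_of([1.00, 0.02, 0.30]))  # -> "concrete"
```

A simple majority vote is used here only to illustrate accumulating material estimates online; the sound-simulation side (e.g., assigning absorption/reflection coefficients per material) is outside the scope of this sketch.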
Publisher
IEEE/SICE
Issue Date
2020-01-14
Language
English
Citation
2020 IEEE/SICE International Symposium on System Integration, pp. 646-651
ISSN
2474-2317
DOI
10.1109/SII46433.2020.9025860
URI
http://hdl.handle.net/10203/273535
Appears in Collection
CS-Conference Papers (Conference Papers)
Files in This Item
There are no files associated with this item.