Recognizing object of attention in virtual reality (가상공간에서의 주의대상객체 인식기법)

DC Field: Value
dc.contributor.advisor: Lee, Sunghee
dc.contributor.advisor: 이성희
dc.contributor.author: Chung, Choongho
dc.date.accessioned: 2019-08-28T02:46:29Z
dc.date.available: 2019-08-28T02:46:29Z
dc.date.issued: 2018
dc.identifier.uri: http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=828478&flag=dissertation (en_US)
dc.identifier.uri: http://hdl.handle.net/10203/266041
dc.description: Master's thesis - Korea Advanced Institute of Science and Technology (KAIST): Graduate School of Culture Technology, 2018.8, [iii, 21 p.]
dc.description.abstract: Recent developments in Virtual Reality (VR) and Mixed Reality (MR) provide new opportunities to foster more immersive remote collaboration. In such collaboration scenarios, it is important to consider cues that help identify the object a user is focusing on. We propose a new data-driven approach that predicts the object of attention in virtual reality by using deictic (pointing) information from the other user's avatar as input features. Our application makes real-time predictions of user attention and gaze point in virtual scenes. Our method marginally outperforms a simple gaze-only prediction method. Further examination of object-detection accuracy shows that our approach is more robust in dynamic scenarios for detecting the object of interest, validating the usefulness of deictic information for gaze prediction.
dc.language: eng
dc.publisher: 한국과학기술원 (KAIST)
dc.subject: Object of attention; deictic information; gaze; egocentric video; remote collaboration; virtual reality
dc.subject: 주의대상객체; 지시동작 정보; 시선; 일인칭 영상; 원격 협업; 가상현실
dc.title: Recognizing object of attention in virtual reality
dc.title.alternative: 가상공간에서의 주의대상객체 인식기법
dc.type: Thesis (Master)
dc.identifier.CNRN: 325007
dc.description.department: 한국과학기술원: 문화기술대학원 (KAIST: Graduate School of Culture Technology)
dc.contributor.alternativeauthor: 정충호
Appears in Collection: GCT-Theses_Master (석사논문)
Files in This Item: There are no files associated with this item.
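The abstract's core idea, scoring scene objects against both the viewer's gaze ray and the partner avatar's pointing (deictic) ray, can be sketched as follows. This is a minimal geometric illustration under stated assumptions, not the thesis's actual data-driven model: the function names, the fixed weights `w_gaze`/`w_deixis`, and the simple cosine-alignment scoring are all assumptions made for this example.

```python
import math

def _unit(v):
    """Return v scaled to unit length."""
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def _cos(u, v):
    """Cosine of the angle between two 3-D vectors."""
    return sum(a * b for a, b in zip(_unit(u), _unit(v)))

def predict_object_of_attention(head_pos, gaze_dir, hand_pos, point_dir,
                                objects, w_gaze=0.6, w_deixis=0.4):
    """Pick the object best aligned with the viewer's gaze ray
    (head_pos, gaze_dir) and the partner avatar's pointing ray
    (hand_pos, point_dir). The weights are illustrative, not learned."""
    best_name, best_score = None, -math.inf
    for name, pos in objects.items():
        gaze_vec = tuple(p - h for p, h in zip(pos, head_pos))
        point_vec = tuple(p - h for p, h in zip(pos, hand_pos))
        score = (w_gaze * _cos(gaze_dir, gaze_vec)
                 + w_deixis * _cos(point_dir, point_vec))
        if score > best_score:
            best_name, best_score = name, score
    return best_name
```

A data-driven version, as the abstract describes, would replace the hand-tuned weighted sum with a model trained on recorded gaze and deictic features; the geometric scoring above only shows why pointing cues help disambiguate attention when gaze alone is noisy.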
