IAMHear: A Tabletop Interface with Smart Mobile Devices using Acoustic Location

IAMHear is a novel tabletop interface for music performance and sound making in which smart mobile devices serve as on-table objects for interaction. Thanks to the advanced features of smart mobile devices, IAMHear is inherently multi-modal and highly interactive. The system also provides an acoustic location mechanism that uses virtually inaudible sound and requires no special sensors, making it simpler in structure and easier to implement. In addition, the use of "everyday objects" invites interaction through intuitive gestures such as placement, movement, and rotation. As a music sequencer, IAMHear lets the user make music by placing objects on the table; inspired by the idea of spectrographic mapping with a virtual scan line, the pitch and timbre of sounds are determined by the location and orientation of tabletop objects as well as by ambient noise. We present IAMHear as a simple and novel alternative to interactive tabletop interfaces for music and for various other multimedia applications.
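The virtual-scan-line sequencing idea described in the abstract can be sketched as follows. This is a hypothetical illustration, not the paper's implementation: the loop length, pitch range, and mapping function are all assumptions. A scan line sweeps across the table once per loop; an object's horizontal position sets when its note fires, and its vertical position sets the pitch.

```python
# Hypothetical sketch of a "virtual scan line" sequencer mapping:
# a scan line sweeps the table once per loop, an object's horizontal
# position determines its note onset, and its vertical position
# determines its pitch. Constants are illustrative assumptions.

LOOP_SECONDS = 2.0        # duration of one sweep of the virtual scan line
PITCH_RANGE = (48, 72)    # MIDI notes C3..C5 mapped along the table depth

def object_to_note(x, y):
    """Map a tabletop object at normalized (x, y) in [0, 1]^2 to
    (onset time in seconds, MIDI pitch)."""
    onset = x * LOOP_SECONDS                 # left edge fires first
    low, high = PITCH_RANGE
    pitch = round(low + y * (high - low))    # near edge is lowest pitch
    return onset, pitch

if __name__ == "__main__":
    # Two devices placed on the table
    print(object_to_note(0.0, 0.0))   # left edge, near edge  -> (0.0, 48)
    print(object_to_note(0.5, 1.0))   # centre, far edge      -> (1.0, 72)
```

In a real system the (x, y) coordinates would come from the acoustic localization step; here they are supplied directly to keep the sketch self-contained.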
Publisher
ACM Special Interest Group on Computer-Human Interaction (SIGCHI)
Issue Date
2013-04-29
Language
English
Citation

The ACM SIGCHI Conference on Human Factors in Computing Systems 2013, pp.1521 - 1526

DOI
10.1145/2468356.2468628
URI
http://hdl.handle.net/10203/188008
Appears in Collection
GCT-Conference Papers (Conference Papers)