ARTIFICIAL EAR FOR ROBOTS

Robots with sound localization capability have mostly relied on machine vision or on arrays of more than two microphones. Humans, however, can localize a sound source hidden from view and distinguish whether a sound comes from the front or rear, from above or below, with only two ears. This ability is mainly due to the complex shape of the pinna. In particular, reflections off the posterior wall of the concha produce spectral notches at different frequencies in the head-related transfer function (HRTF) as the sound source shifts in position. From non-individualized HRTFs measured with a B&K HATS (head and torso simulator), we confirmed the relationship between the spectral notches and the geometry of the concha. Based on observations of the HATS pinna and the resulting HRTFs, we propose a novel artificial ear that can be mounted on a robot head, in order to explore the possibility of sound localization sensors that use only two microphones. Experimental results with the designed artificial ear show that the spectral notches change distinctly with elevation in the frontal region, whereas they disappear in the rear. Given this result, a sound direction in 3-D space is expected to be resolvable with only two microphones, and the designed artificial ear can serve as a suitable mechanical sensor for sound source localization.
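
The elevation dependence of those notches can be illustrated with a simple single-reflection model: if the bounce off the posterior concha wall lengthens the acoustic path by Δ relative to the direct sound, destructive interference places notches near f_k = (2k + 1)·c / (2Δ), so the notch frequencies drop as the extra path grows. The Python sketch below is a minimal illustration of that relation only; the per-elevation extra-path values are hypothetical placeholders, not measurements from the paper.

import numpy as np

C = 343.0  # speed of sound in air, m/s

def notch_frequencies(extra_path_m, count=3):
    # Notches of a direct sound plus one reflection delayed by
    # extra_path_m / C fall at f_k = (2k + 1) * C / (2 * extra_path_m).
    return np.array([(2 * k + 1) * C / (2 * extra_path_m) for k in range(count)])

# Hypothetical extra path lengths (in mm) for three source elevations,
# chosen only to show the direction of the notch shift.
for elevation_deg, extra_path_mm in [(-30, 22.0), (0, 18.0), (30, 14.0)]:
    f = notch_frequencies(extra_path_mm * 1e-3)
    print(f"elevation {elevation_deg:+3d} deg -> first notch ~ {f[0] / 1e3:.1f} kHz")

With these placeholder numbers the first notch moves from roughly 7.8 kHz at -30 degrees elevation to about 12.3 kHz at +30 degrees, mirroring the kind of elevation-dependent notch shift the abstract attributes to the concha reflection.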
Description
Department of Mechanical Engineering
Issue Date
2006-10-22
Language
English
Citation

Proceedings of IEEE Sensors, pp.1460-1463

URI
http://hdl.handle.net/10203/3027
Appears in Collection
ME-Conference Papers (Conference Papers)
Files in This Item
IEEE Sensors 2006_Artificial Ear for Robots.pdf (4.2 MB)