Auditory sensors are used less often than visual sensors in robotics and digital devices. However, they are more capable in one respect: they can observe stimuli arriving from every direction, whereas visual sensors are limited to a line of sight. Sound localization, a popular application of sound sensors, is the determination of the position of a sound's source. The purpose of this project was to create a microphone array system capable of real-time sound localization. The method used for localization was time difference of arrival (TDOA), estimated by analyzing phase shifts between microphone signals. While this method is fragile, it is the most practical for real-time applications because of its simplicity and computational speed. To make the system more robust, a particle filter was implemented to reduce the effect of random errors. The apparatus consisted of a pair of in-line microphone arrays contained within PlayStation Eye cameras, mounted perpendicular to one another to maximize the sound differences along each axis. Upon testing of this apparatus, an average error of 4.3% per coordinate was obtained. While this fell short of the desired accuracy, the device would correctly track a person speaking at close range in real time. A future improvement would be a more spread-out, geometric array, whose wider microphone spacing would yield more accurate phase-shift differences than the tightly spaced microphones within the Eye.
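The abstract does not give the exact phase-shift algorithm, but the standard phase-based TDOA estimator is generalized cross-correlation with phase transform (GCC-PHAT), which whitens the cross-power spectrum so that only phase information determines the delay. A minimal sketch, assuming two mono signals sampled at the same rate (the function name and parameters are illustrative, not from the project):

```python
import numpy as np

def estimate_tdoa(sig_a, sig_b, fs):
    """Estimate the time difference of arrival between two microphone
    signals via the phase of the cross-power spectrum (GCC-PHAT).
    A positive result means sig_b arrives later than sig_a.
    Hypothetical sketch; not the project's actual implementation."""
    n = len(sig_a) + len(sig_b)          # zero-pad to avoid circular wrap
    A = np.fft.rfft(sig_a, n=n)
    B = np.fft.rfft(sig_b, n=n)
    cross = B * np.conj(A)
    cross /= np.abs(cross) + 1e-12       # PHAT weighting: keep phase only
    cc = np.fft.irfft(cross, n=n)
    max_shift = n // 2
    # Reorder so index 0 of the array corresponds to lag -max_shift.
    cc = np.concatenate((cc[-max_shift:], cc[:max_shift + 1]))
    lag = int(np.argmax(np.abs(cc))) - max_shift
    return lag / fs                      # delay in seconds
```

Dividing the recovered delay by the known microphone spacing and the speed of sound then gives a bearing angle, which is why closely spaced microphones (as inside the Eye) make the estimate fragile.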
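The particle filter mentioned above can be sketched as follows: a cloud of candidate positions is propagated with a simple motion model, reweighted by how well each candidate explains the latest (noisy) localization fix, and resampled when the weights degenerate. All parameter values here are assumptions for illustration, not the project's settings:

```python
import numpy as np

def particle_filter_step(particles, weights, measurement, rng,
                         motion_std=0.02, meas_std=0.1):
    """One update of a 2-D particle filter that smooths noisy position
    fixes from the localizer. Hypothetical sketch; motion_std and
    meas_std are assumed, not taken from the project."""
    # Predict: random-walk motion model for the sound source.
    particles = particles + rng.normal(0.0, motion_std, particles.shape)
    # Update: Gaussian likelihood of the new measurement.
    d2 = np.sum((particles - measurement) ** 2, axis=1)
    weights = weights * np.exp(-d2 / (2.0 * meas_std ** 2))
    weights /= weights.sum()
    # Resample when the effective sample size drops below half.
    if 1.0 / np.sum(weights ** 2) < len(particles) / 2:
        idx = rng.choice(len(particles), size=len(particles), p=weights)
        particles = particles[idx]
        weights = np.full(len(particles), 1.0 / len(particles))
    # Point estimate: weighted mean of the particle cloud.
    return particles, weights, weights @ particles
```

Because each raw TDOA fix is corrupted by independent random error, averaging over the particle cloud in this way suppresses outliers without adding noticeable latency, which is consistent with the real-time goal stated above.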
Acoustical Society of America: Honorable Mention