The sense of hearing confers many important advantages, but few are as fundamental to survival or as universal across species as the capacity for auditory spatial awareness (ASA). Because sound alone provides information about objects and events throughout 360º of extrapersonal space, ASA is critical for awareness outside the visual field and for the experience of sensory “immersion” in both natural and virtual scenes. However, because space is not explicitly represented in the auditory periphery, it must be computed or inferred from multiple informative but imperfect acoustical features (“cues”). For real sounds, these cues are distributed across time and frequency, and are often distorted in complex ways by echoes and reverberation. Nevertheless, young normal-hearing listeners are remarkably good at localizing sounds and understanding the auditory scene, even in acoustically complex environments.
In this talk, Stecker will discuss (1) how listeners weight and combine auditory spatial cues across cue type, time, and frequency; (2) how that ability relates to the consequences of reverberation, hearing loss, and hearing-aid technology on spatial hearing; and (3) what neuroimaging with fMRI can tell us about cortical mechanisms that process auditory spatial cues and represent the auditory scene. Along the way, Stecker will discuss applications to virtual and augmented reality, next-generation hearing aids, and potential avenues for spatial habilitation in diverse patient populations.