Tue, 24 Jan 2006 16:35:51 -0600
I'm not sure if this is the right place to ask this, but I can't seem to find
any group that's more appropriate.
I'm wondering what mechanism lets us perceive a sound as being in front of us
or behind us. I would say it's the ears, as I can argue that without them one
could not tell front from back (or, really, distinguish any two directions
that lie at equal distances from both ears).
HRTF - head-related transfer function.
When we move our heads or sound sources move, their timing and timbre change.
One ear receives a sound later than the other when the source is not straight
ahead of us. At frequencies where we can't perceive such small timing changes,
there are changes in timbre due to the acoustical effects of our heads and
outer ears.
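
To put rough numbers on that timing cue, here is a small Python sketch
using the classic Woodworth spherical-head approximation; the head radius
and speed of sound are nominal assumed values, not anything from this
thread:

    import math

    def itd_woodworth(azimuth_deg, head_radius_m=0.0875, c=343.0):
        # Woodworth approximation for the interaural time difference
        # of a distant source: ITD = (r / c) * (theta + sin(theta)),
        # where theta is the azimuth in radians (0 = straight ahead,
        # 90 = directly to one side).
        theta = math.radians(azimuth_deg)
        return (head_radius_m / c) * (theta + math.sin(theta))

    # A source 30 degrees off-center reaches the near ear ~260 us
    # early; dead ahead gives zero difference, which is why timing
    # alone can't separate front from back.
    for az in (0, 30, 90):
        print(f"{az:3d} deg -> {itd_woodworth(az) * 1e6:6.1f} us")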
But HRTFs don't just work for moving objects.
We have learned what some sound sources generally sound like when they
approach us from various directions.
Or we learn how the location and quality of a particular sound source
change, say by correlating with sight, as it moves. We can then use this
knowledge to guess the direction of the sound source and/or track it.
I figure that it has something to do with reverberation and some other
things, but I'm not sure. Does anyone have a clue, or know where I can get
some more information on this subject?
It's due to a combination of effects.
It's partially due to our ears' sensitivities to differences in the
amplitude, timing, and phase of sound arriving at the two ears.
It's partly due to the "head-related transfer function" - the change
in our ears' frequency responses to sounds arriving from different
angles, due to shadowing of higher frequencies by the head and the
pinna (outer portion) of the ears, and due to frequency-selective
reinforcement and cancellation of sound by the ridges and valleys in
the pinnae. Without external ears, the pinna shadowing and
comb-filtering wouldn't exist, but there would still be some
side-to-side shadowing done by the head itself.
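
To get a feel for that comb-filtering, here's a toy numpy sketch: a
direct path plus one short, attenuated reflection puts regular dips in
the spectrum, and since the reflection delay varies with arrival angle,
the dip positions encode direction. The 62.5-microsecond delay is an
arbitrary illustrative value, not a measured pinna figure:

    import numpy as np

    fs = 48_000            # sample rate, Hz
    n = 3                  # assumed pinna-reflection delay, in samples
    tau = n / fs           # 62.5 microseconds

    # Impulse response: direct sound plus one attenuated reflection.
    h = np.zeros(512)
    h[0] = 1.0
    h[n] = 0.5

    freqs = np.fft.rfftfreq(h.size, d=1 / fs)
    mag_db = 20 * np.log10(np.abs(np.fft.rfft(h)))

    # The dips sit near f = (2k + 1) / (2 * tau): 8 kHz and 24 kHz
    # here. A different arrival angle changes tau, moving the dips.
    for k in (0, 1):
        f_notch = (2 * k + 1) / (2 * tau)
        i = int(np.argmin(np.abs(freqs - f_notch)))
        print(f"dip near {f_notch / 1000:.0f} kHz: {mag_db[i]:+.1f} dB")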
It's partially due to real-time three-dimensional head tracking. Even
a slight rotation or tilting of the head will change the positions of
the ears with respect to the sound source. This will change the
inter-ear timing/phase of a sound in front of us in one way, and will
have an opposite effect on the inter-ear timing of a sound arriving
from the rear. The brain says "Hmmm, I rotated head clockwise, sound
phase in right ear moved forwards relative to sound phase in left ear,
sound source must be behind me." Since we rarely hold our heads
entirely still, we have this sort of additional data available to our
brains on a fairly constant basis.
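
You can see that sign flip with a crude model. The sketch below uses a
simple path-difference ITD of d*sin(azimuth)/c with an assumed 17.5 cm
ear spacing (chosen instead of the Woodworth formula above because it
handles rear angles symmetrically). A front source at 30 degrees and its
mirror-image rear source at 150 degrees give identical ITDs; a 10-degree
head turn then moves the two ITDs in opposite directions:

    import math

    def itd_simple(az_deg, d=0.175, c=343.0):
        # Crude interaural time difference for a distant source:
        # path difference d * sin(azimuth) between ears d meters
        # apart. Azimuth from straight ahead; +90 = right, 180 = rear.
        return (d / c) * math.sin(math.radians(az_deg))

    front, back = 30.0, 150.0          # mirror positions, same ITD
    assert abs(itd_simple(front) - itd_simple(back)) < 1e-12

    # Turn the head 10 degrees to the right: every source's azimuth
    # *relative to the head* drops by 10 degrees.
    for az in (front, back):
        change = itd_simple(az - 10.0) - itd_simple(az)
        print(f"source at {az:5.1f} deg: ITD change {1e6 * change:+.1f} us")

    # The front source's ITD shrinks while the rear source's grows;
    # that sign difference is what resolves the front/back ambiguity.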
Blauert's "Spatial Hearing" was the best overall text on this whole
subject when I studied it about 15 years ago.
Software or hardware simulation can do a fair job of taking a mono or
stereo sound source and manipulating it so that it seems to "move
around". The trick is somewhat easier with headphones, harder with
speakers. Doing a really good job requires monitoring the position of
the listener's head, so that the effects of head rotation can be
reproduced as well. There are a bunch of research papers which discuss
the fundamentals of sound localization.
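
As an illustration only (real systems convolve the signal with measured
HRTF filter pairs for each angle), here's a bare-bones Python/numpy
sketch of the headphone version of the trick: delay and slightly
attenuate the far ear according to the desired azimuth. The ear spacing
and 0.7 shadowing gain are assumed values:

    import numpy as np

    def binaural_pan(mono, az_deg, fs=48_000, d=0.175, c=343.0):
        # Crudely place a mono signal at the given azimuth for
        # headphones: delay the far ear by the interaural time
        # difference and attenuate it to mimic head shadowing. A
        # serious renderer would convolve with a measured HRTF pair.
        itd = (d / c) * np.sin(np.radians(az_deg))  # >0: right leads
        lag = int(round(abs(itd) * fs))             # far-ear delay
        near = mono
        far = 0.7 * np.concatenate([np.zeros(lag), mono])[:mono.size]
        return (far, near) if itd > 0 else (near, far)  # (left, right)

    # Example: a one-second 500 Hz tone placed 45 degrees to the right.
    fs = 48_000
    t = np.arange(fs) / fs
    tone = 0.3 * np.sin(2 * np.pi * 500 * t)
    left, right = binaural_pan(tone, 45.0, fs)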