Below about 2000 Hz we use the arrival times of the sound at each ear to work out left-right position. Above about 2000 Hz we use the relative loudness at each ear instead, because the head shadows high frequencies very effectively, even for small angles off axis like 30 degrees, and especially above 6000 Hz. For these reasons I believe phase is very important: if high frequencies are delayed by your typical minimum-phase filter or MQA, then imaging won't be as precise, because the location cues arrive later than they should.
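To put rough numbers on the time-of-arrival cue, here's a minimal Python sketch (my own illustration, not a measurement) using the standard Woodworth approximation for interaural time difference; the head radius and the example angles are assumptions chosen for illustration.

```python
# Woodworth approximation: ITD ≈ (r / c) * (theta + sin(theta)),
# where r is head radius and c is the speed of sound. The cue tops out
# around 600-700 µs at 90°, so a filter that delays the highs by even a
# few tens of microseconds relative to the lows is competing with it.
import math

HEAD_RADIUS_M = 0.0875      # assumed average head radius
SPEED_OF_SOUND_M_S = 343.0  # roughly, at room temperature

def itd_seconds(azimuth_deg: float) -> float:
    """Interaural time difference for a source at the given azimuth."""
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS_M / SPEED_OF_SOUND_M_S) * (theta + math.sin(theta))

for az in (5, 15, 30, 60, 90):
    print(f"{az:3d} deg  ITD ~ {itd_seconds(az) * 1e6:6.1f} us")
```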
Front-back and up-down directionality is more complex. We use floor reflections, which cause comb filtering, to help work out height. We also use the phase distortion caused by our pinnae to work out front versus back and, to a lesser extent, up versus down.
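As a rough sketch of the floor-bounce comb filtering (assumed geometry, not a measurement of any real room), the reflected path is longer than the direct path, and the resulting delay puts notches in the response at odd multiples of half the inverse delay:

```python
# Comb filtering from a single floor reflection: compute the path-length
# difference via the mirror-image source below the floor, convert it to a
# delay, and list the first few notch frequencies.
import math

SPEED_OF_SOUND_M_S = 343.0

def floor_comb_notches(src_height_m, ear_height_m, distance_m, n=5):
    """First n notch frequencies (Hz) for one floor reflection."""
    direct = math.hypot(distance_m, src_height_m - ear_height_m)
    reflected = math.hypot(distance_m, src_height_m + ear_height_m)
    delay = (reflected - direct) / SPEED_OF_SOUND_M_S
    return [(2 * k + 1) / (2 * delay) for k in range(n)]

# Example (assumed): tweeter at 1.0 m, ears at 1.1 m, listener 3 m away.
print([round(f) for f in floor_comb_notches(1.0, 1.1, 3.0)])
# First notch lands around 260 Hz for this geometry, then roughly every
# 520 Hz above that; the pattern shifts as source height changes, which
# is the height cue being described.
```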
Anyway, like a dog, we will naturally tilt our heads or move side to side to make better use of our localisation abilities, especially since high frequencies are so heavily attenuated or blocked by the head.
I would say we can detect the direction of a sound to within two or three inches from 20 feet away, given enough sonic information (it won't work for a 100 Hz tone, where directionality is much harder to judge).
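For what it's worth, here's the arithmetic behind that claim (the numbers are just the ones stated above):

```python
# What angle does "two or three inches at 20 feet" correspond to?
import math

for offset_in in (2.0, 3.0):
    angle_deg = math.degrees(math.atan2(offset_in, 20 * 12))
    print(f"{offset_in} in at 20 ft ~ {angle_deg:.2f} deg")
# Roughly 0.5-0.7 degrees, which is in the same ballpark as the ~1 degree
# minimum audible angle often quoted for broadband sources straight ahead.
```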