Music is processed in the amygdala. But if the brain detects that something is wrong (speed, distortion, tonal balance and the like), there is a tipping point at which the processing is transferred to the cerebral cortex. At that point much of the emotional impact is lost.
When you are auditioning cables or comparing two audio products, it is the cerebral cortex that is processing the music.
The goal of the designer is to keep the musical processing in the limbic system (amygdala) so as to have the most impact on the listener. To do this, the equipment has to be designed to honor and obey the rules of human hearing.
For example, the ear uses the higher-ordered harmonics to sense sound pressure, and it assigns tonality to all harmonics in the same way it does for musical instruments. So if an amplifier has higher-ordered harmonic content that isn't masked somehow (the musical signal itself cannot do it), the amp will sound harsh and bright, due to the tonality the ear assigns to those harmonics, and the processing will likely not stay in the limbic system.
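The disproportionate audibility of higher-ordered harmonics can be sketched numerically. The snippet below is a minimal illustration, not a published metric: the n-squared weight is a hypothetical stand-in for the ear's greater sensitivity to higher harmonic orders, and the example spectra are invented for illustration.

```python
def weighted_distortion(harmonics, weight=lambda n: n * n):
    """Sum harmonic amplitudes weighted by harmonic order.

    harmonics: dict mapping harmonic order -> amplitude relative
    to the fundamental (e.g. 0.01 means 1%). The n*n weight is a
    hypothetical model of the ear weighting higher orders more.
    """
    return sum(weight(n) * a for n, a in harmonics.items())

# Invented example spectra:
low_order = {2: 0.01}    # 1% second harmonic
high_order = {7: 0.001}  # 0.1% seventh harmonic -- 10x less distortion

print(f"{weighted_distortion(low_order):.4f}")   # 0.0400
print(f"{weighted_distortion(high_order):.4f}")  # 0.0490
```

Despite measuring ten times lower by a plain THD number, the seventh-harmonic spectrum scores higher under the order weighting, which is the point about why small amounts of unmasked higher-ordered content can dominate how an amplifier sounds.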
There are a good number of similar examples; it's like negotiating a minefield for success. But if the designer understands the perceptual rules of human hearing, they can be applied to the design (engineering).