Amps all measure differently, yet our measurements remain poor even in this era of incredibly sensitive and accurate instrumentation. Why? It's really quite simple, if you know something about the limitations of electronic instrumentation.
What are the instruments we use to measure amps? Signal generators (sine and square wave), oscilloscopes for the time domain, and precision audio-band spectrum analyzers. A few systems are all-digital and have enough bits of precision to see noise, harmonic distortion, and IM distortion down to more than 140 dB of dynamic range.
But is that enough to see it all? I contend no, not a chance. For measuring time-domain signals like square wave response, we typically have only 8-10 bits of resolution, compared to roughly 20 bits of audible resolution in our hearing. That is woefully inadequate for seeing what's going on in even one small case, let alone the large number of signal cases we would want to observe in the time domain.
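To put that resolution gap in numbers, here is a quick back-of-the-envelope sketch in Python, using the standard rule of thumb that an ideal N-bit quantizer gives roughly 6.02N + 1.76 dB of dynamic range for a full-scale sine wave (a textbook idealization, not the spec of any particular scope):

```python
# Ideal dynamic range of an N-bit quantizer, full-scale sine input.
# Rule of thumb: SNR ~ 6.02*N + 1.76 dB. Real instruments do worse.

def ideal_dynamic_range_db(bits: int) -> float:
    """Theoretical SNR of an ideal N-bit quantizer for a full-scale sine."""
    return 6.02 * bits + 1.76

for bits in (8, 10, 16, 20, 24):
    print(f"{bits:2d} bits -> ~{ideal_dynamic_range_db(bits):5.1f} dB")
```

An 8-bit scope front end tops out around 50 dB; the roughly 20 bits attributed to hearing above corresponds to about 120 dB, some 70 dB more than the scope can resolve.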
Spectrum analyzers can see more of what's going on, with greater dynamic range than most people's hearing. But there is one big caveat: they can only measure a repetitive waveform that is unvarying over "infinite time", something no musical waveform ever produces. Sure, the Fourier transform can reconstruct that waveform in the time domain with an accuracy the scopes cannot match, but only under non-dynamic conditions.
Granularity of a signal, dithering of a signal, and random events, all caused by an amp, and all of which do happen constantly, are "averaged away" by the precision spectrum analyzer, because it zeros in on the repetitive waveform and tends to ignore the non-repetitive pieces of it.
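Here is a small pure-Python sketch of that averaging effect (the numbers are illustrative, not taken from any real analyzer): a one-sample glitch that occurs in only one of 100 measurement frames is plainly visible in that frame's spectrum, but nearly vanishes once the frames' spectra are averaged.

```python
import cmath, math

def dft_mag(frame):
    """Naive DFT magnitude spectrum (pure Python; fine for tiny frames)."""
    n = len(frame)
    return [abs(sum(x * cmath.exp(-2j * math.pi * k * t / n)
                    for t, x in enumerate(frame))) / n
            for k in range(n // 2)]

N, FRAMES = 64, 100
tone = [math.cos(2 * math.pi * 8 * t / N) for t in range(N)]  # steady tone in bin 8

glitched = tone[:]
glitched[5] += 1.0  # a single one-sample event, in just one frame

# Average the magnitude spectra of 100 frames; only frame 0 has the glitch.
avg = [0.0] * (N // 2)
for f in range(FRAMES):
    frame = glitched if f == 0 else tone
    for k, m in enumerate(dft_mag(frame)):
        avg[k] += m / FRAMES

# In its own frame the glitch raises every bin by ~1/N; in the
# 100-frame average it is diluted by another factor of 100.
print(f"glitch level in its own frame : {dft_mag(glitched)[20]:.6f}")
print(f"glitch level after averaging  : {avg[20]:.6f}")
```

The repetitive tone survives averaging untouched, while the one-off event ends up 100 times smaller than it actually was in the frame where it occurred, which is exactly the "averaged away" behavior described above.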
Even so, some harmonics and IMD products are more damaging than others, and we are still guessing how those varying combinations will sound. There are sonic signatures of distortion from minor parts that measure almost perfectly, and how can we possibly hear those, swamped by immense other distortions? I once asked a similar question: how much of the chemical that ruins a fine bottle of wine by making it taste "corked" is required? The answer: 5 parts per billion, or 0.0000005% distortion from that impurity. Other chemicals have far less impact, even if you add some vinegar, or sulphuric acid, or worse.
The only way to start examining the errors of the "perfect sound forever" CD was to have more than 16 bits and 44.1 ksamples/sec. DVD-Audio succeeded there, by digitally recording dynamic musical waveforms at 24 bits and 192 ksamples/sec and comparing that to the CD data. They are not perfectly equivalent. The moral of this story: seeing real-time signals out of amps to the extent humans can hear them requires a very high sample rate, a wide measurement resolution (bits), and observation over some period of time, using a source signal that is not a simple repetitive waveform.
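As a rough illustration of the gap between the two formats (textbook formula only; it ignores dither shaping and real converter noise), the ideal in-band quantization SNR combines bit depth with the gain from spreading quantization noise across a higher sample rate:

```python
import math

def in_band_snr_db(bits: int, fs_hz: float, band_hz: float = 20_000.0) -> float:
    """Ideal quantization SNR over an audio band: 6.02*N + 1.76 dB plus
    the oversampling gain 10*log10(fs / (2 * bandwidth))."""
    return 6.02 * bits + 1.76 + 10 * math.log10(fs_hz / (2 * band_hz))

print(f"CD        (16/44.1k): ~{in_band_snr_db(16,  44_100):.1f} dB")
print(f"DVD-Audio (24/192k) : ~{in_band_snr_db(24, 192_000):.1f} dB")
```

On these idealized numbers the 24-bit/192k format buys roughly 55 dB more in-band headroom over a 20 kHz audio band, which is what makes it usable as a yardstick for the CD's errors.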
But how do you really compare the differences that do show up? How do you interpret them? There remains only one way: listen to the results in the context of the whole system. Audio designers who fail to do this are oversimplifying, trusting too much in their limited measurements, and ignorant of the true nature of those measurements.
I've worked for a leading test and measurement company as a test engineer for 24 years now, and I think I know what I'm talking about on that subject.
Kurt