Audiophile Loudness Wars—Too Much again!?


Obviously a huge chunk of popular Western music has been caught up in the recording industry's loudness wars. But do we now have hi-fi loudness wars? Sometimes I look at or try out new pieces of gear and think audio designers are putting too much gain in our preamps, amps, and DACs. Or am I off here? You won't hurt my feelings.
I'm getting the sense that a lot of gear errs on the high-gain side of center.

Hey, it happens to me! You can’t edit titles. It’s supposed to be “too much gain”.
jbhiller
Thinking about many of the responses, I suspect we are confusing several issues: gain, recording volume, the use of limiters and compressors to increase average playback level, etc. Unless you are running a very unusual piece of equipment, "more gain" will not cause those issues (such as **compression** or **loud playback levels**) on its own. It simply plays back louder for any given volume knob setting.

The only real issues are 1) loss of fine volume control and poor channel tracking when you're only using, say, a handful of positions near the bottom of the volume control, and 2) potential overload of any preamp stage that sits before the volume control (all stages after it are buffered by that attenuation).
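
A quick back-of-the-envelope sketch of that point, in Python (the source level and gain figures here are made up purely for illustration): more gain is just a bigger multiplier on the same signal at a given knob position.

```python
# Made-up source level and gain figures, purely for illustration.
def db_to_ratio(db):
    """Convert a voltage gain in dB to a linear multiplier."""
    return 10 ** (db / 20)

source_v = 2.0  # hypothetical 2 Vrms source
for preamp_gain_db in (6, 12, 20):
    out_v = source_v * db_to_ratio(preamp_gain_db)
    print(f"{preamp_gain_db:>2} dB of gain -> {out_v:4.1f} Vrms out "
          f"(same signal, just louder at the same knob setting)")
```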


I can't speak to the issues re: separates, but I adjust gain on my DAC and mixer (2 channels, 2 TTs) as needed and generally leave the volume on my integrated alone (at about 11 o'clock) unless I want to get very loud. This is the best way I've found to keep volume consistent and under control, regardless of source.
Someone mentioned component matching.
I agree that much of the issue centers on this, especially if you are mixing tube and solid-state preamps and amps.
You can't please everyone, or optimize a component for every possible system. It's perhaps not so much a matter of "hifi loudness wars" as "analog vs. digital wars". DACs can have very high line output levels; 4V is not atypical from a balanced DAC's XLR outputs. Compare this to a typical "audiophile" analog setup with a 0.5mV MC cartridge into a phono stage with 60dB of gain, netting 0.5V of line output. That's 18dB lower than the 4V DAC! That's not just a huge difference - it's a world apart. I believe those with tape decks can have similar issues.
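
For anyone who wants to check that 18dB figure, here is the arithmetic as a quick sketch, using the example numbers above (not a spec for any particular product):

```python
import math

# Example numbers from this post, not a spec for any particular product.
dac_out_v = 4.0          # balanced XLR DAC, full-scale output (Vrms)
cart_out_v = 0.5e-3      # 0.5 mV MC cartridge
phono_gain_db = 60       # phono stage gain

phono_out_v = cart_out_v * 10 ** (phono_gain_db / 20)   # 0.5 mV * 1000 = 0.5 V
gap_db = 20 * math.log10(dac_out_v / phono_out_v)

print(f"Phono chain line level: {phono_out_v:.2f} Vrms")
print(f"DAC is {gap_db:.1f} dB hotter than the analog front end")
```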

Preamps in particular - analog audiophiles often find real use for high-gain preamps with 20dB or more (!!) of gain. That same gain level is quite ludicrous when hooked up to a digital source. That's when you get guys saying "I can't go past 9:00 on the volume before it's blasting me out of the room".

Then, beyond the preamp/source stage, you have to match the gain and sensitivity of your amp to your speakers. So a worst-case scenario of bad gain-structure system building ("it's too loud as soon as I move the volume at all") would look like this (rough numbers are sketched after the list):

* Balanced XLR DAC 4V
* Tube preamp 23dB gain
* High sensitivity high power amp 30dB gain (high power PP tube amps typically have the highest gains)
* High sensitivity speakers - upper 90s dB/Watt
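
Here is roughly how badly that chain stacks up, as a sketch (the 8-ohm load, ~1m listening distance, and 95dB SPL peak target are my assumptions for illustration, not anyone's actual setup):

```python
import math

# Assumptions for illustration: 8-ohm speakers at 97 dB/W/m, listening at ~1 m,
# and a loud-but-sane 95 dB SPL peak target at the listening position.
dac_v       = 4.0   # balanced DAC full-scale output, Vrms
pre_gain_db = 23    # tube preamp gain
amp_gain_db = 30    # power amp voltage gain
sens_db_w   = 97    # speaker sensitivity, dB SPL @ 1 W / 1 m
load_ohms   = 8
target_spl  = 95

# Power and amp output voltage needed to hit the target SPL
watts_needed = 10 ** ((target_spl - sens_db_w) / 10)
amp_out_v = math.sqrt(watts_needed * load_ohms)

# Work backwards through the amp and preamp gains to the volume control
pre_out_v = amp_out_v / 10 ** (amp_gain_db / 20)
pre_in_v = pre_out_v / 10 ** (pre_gain_db / 20)
attenuation_db = 20 * math.log10(dac_v / pre_in_v)

print(f"Power needed: {watts_needed:.2f} W -> {amp_out_v:.2f} V at the speaker")
print(f"Volume control must throw away ~{attenuation_db:.0f} dB of the DAC's output")
```

Close to 60dB of attenuation just to stay at 95dB peaks - which is exactly the "can't get past 9:00" complaint.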

The cure is to learn gain structure and how to build a system that correctly meets your needs. So e.g. if you want to run both a low-output MC cartridge and a digital source, you'll likely want heaps of gain in your phono stage (70dB or so) and a preamp with a modest 8dB - 14dB of gain (or even lower).
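
As a sanity check on that recipe (again just a sketch; the 2Vrms single-ended DAC is my assumption for comparison):

```python
# Example figures from this post; the 2 Vrms single-ended DAC is my assumption.
lomc_v        = 0.5e-3   # low-output MC cartridge, 0.5 mV
phono_gain_db = 70       # healthy phono stage gain
dac_v         = 2.0      # typical single-ended DAC output, Vrms

phono_line_v = lomc_v * 10 ** (phono_gain_db / 20)
print(f"Phono chain line level: {phono_line_v:.2f} Vrms vs. DAC at {dac_v:.1f} Vrms")
# ~1.6 V vs. 2.0 V: close enough that one modest 8-14 dB line stage serves both.
```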

Honestly, even as an analog guy I find preamps with 20+ dB of gain to be too much (though I do have very sensitive speakers downstream). You inevitably start hearing the noise floor from all that gain (again, on sensitive speakers and amps). And there’s still a surprising number of tube preamps with this amount of gain. I think this comes from the days when CJ and ARC would make "full function" preamps with an MM phono stage that *could* also run lower output MC cartridges by tapping into that surplus of line stage gain. But it was a bad match when the owner hooked up a digital source. Also this may come from the ubiquity of 12ax7 tubes & circuits. If you see a tube preamp with 12ax7 tubes in it - you KNOW it’s gonna have an absurd amount of gain.
DACs can have very high line output levels; 4V is not atypical from a balanced DAC's XLR outputs. Compare this to a typical "audiophile" analog setup with a 0.5mV MC cartridge into a phono stage with 60dB of gain, netting 0.5V of line output. That's 18dB lower than the 4V DAC! That's not just a huge difference - it's a world apart.

All correct. But the question is, why is this tolerated? It's not hard to design a balanced-output DAC with a proper output level (of course, with stuff all over the map it's getting harder and harder to determine what "proper" is, but I would suggest between 1 and 1.5Vrms at full output).

As to MC - again, what we need there (and all my stuff could do it, even "back in the day") is provision for >60dB of gain. I could provide up to 68dB, with a very low-noise, low-impedance balanced input stage.

The only problem was, again, the wide variation in cartridges and components, such that no one setting for MC or MM would be ideal for everyone. So it could be custom-set to whatever a dealer or customer wanted - by special order.

But if designers adhered to even nominal, de facto standards there would be far less trouble. Right now I'm designing a product that uses an unconventional part, which is great in many ways but, due to basic physics, will overload with an input of more than 4.25V p-p (divide by 2.83 for RMS). This means I MUST have optional pads, or really bad things happen. Pisses me off.
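
For anyone following along, the pad arithmetic works out roughly like this (the 4Vrms source is just the worst-case balanced DAC example from earlier in the thread, not any specific unit):

```python
import math

# The 4.25 V p-p limit is the part in question; the 4 Vrms source is just the
# worst-case balanced DAC example from earlier in the thread.
max_in_vpp = 4.25
max_in_vrms = max_in_vpp / (2 * math.sqrt(2))   # ~1.5 Vrms for a sine wave
hot_source_vrms = 4.0

pad_db = 20 * math.log10(hot_source_vrms / max_in_vrms)
print(f"Clipping point: {max_in_vrms:.2f} Vrms")
print(f"A 4 Vrms balanced source needs roughly a {pad_db:.1f} dB pad ahead of it")
```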

End rant :-)