DACs can have very high line output levels: up to 4V is not atypical for
a balanced DAC’s XLR outputs. Compare this to a typical "audiophile"
analog setup with a 0.5mV MC cartridge into a phono stage with 60dB of
gain, netting 0.5V line output. That’s 18dB lower than the 4V DAC!
That’s not just a huge difference - that’s a world apart.
All correct. But the question is: why is this tolerated? It's not hard to design a balanced-output DAC with a proper output level (of course, with levels all over the map it's getting harder and harder to determine what "proper" is, but I would suggest between 1-1.5V RMS at full output).
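For anyone who wants to check the arithmetic, here's a quick sketch (Python, using only the numbers quoted above: 4V RMS DAC output, 0.5mV cartridge, 60dB phono gain) of the 18dB gap, and of how far even a 1-1.5V RMS full-scale target would sit below the 4V DAC:

```python
import math

def db(ratio):
    """Voltage ratio to decibels: 20 * log10(V1 / V2)."""
    return 20 * math.log10(ratio)

dac_out = 4.0          # V RMS, balanced DAC XLR output (from the post)
cart_out = 0.5e-3      # V RMS, nominal MC cartridge output
phono_gain_db = 60.0   # dB, phono stage gain

# Cartridge output through the phono stage -> 0.5 V RMS
phono_out = cart_out * 10 ** (phono_gain_db / 20)
print(f"phono output: {phono_out:.2f} V RMS")
print(f"gap to DAC:   {db(dac_out / phono_out):.1f} dB")  # ~18 dB

# Even the suggested 1-1.5 V RMS full-scale target sits well below 4 V:
for target in (1.0, 1.5):
    print(f"4 V vs {target} V target: {db(dac_out / target):.1f} dB")  # ~12 / ~8.5 dB
```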
As to the MC: again, what we need there (and all my stuff could do it, even "back in the day") is the provision for more than 60dB of gain. I could provide up to 68dB, with a very low-noise, low-impedance balanced input stage.
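A rough sketch of why >60dB matters, again assuming the 0.5mV cartridge figure from above: 68dB of gain lands that cartridge right in the suggested 1-1.5V RMS window, and hitting the top of that window actually requires close to 70dB:

```python
import math

cart_out = 0.5e-3   # V RMS, MC cartridge output (value from the post)

# Output at the 68 dB gain figure mentioned above:
gain_db = 68.0
v_out = cart_out * 10 ** (gain_db / 20)
print(f"{gain_db:.0f} dB gain -> {v_out:.2f} V RMS")  # ~1.26 V

# Gain needed to reach the 1-1.5 V RMS "proper" full-output window:
for target in (1.0, 1.5):
    need = 20 * math.log10(target / cart_out)
    print(f"{target} V target needs {need:.1f} dB")   # ~66 / ~69.5 dB
```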
The only problem was, again, the wide variation in cartridges and components, such that no one setting for MC or MM would be ideal for everyone. So it could be custom-set to whatever a dealer or customer wanted, by special order.
But if designers adhered to even nominal, de-facto standards there would be far less of an issue. Right now I'm designing a product that uses an unconventional part, which is great in many ways but, due to basic physics, will overload with an input of more than 4.25V p-p (divide by 2√2 ≈ 2.83 for RMS). This means I MUST have optional pads, or really bad things happen. Pisses me off.
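For the conversion mentioned above: a sine wave's RMS value is its peak-to-peak value divided by 2√2 ≈ 2.83. A quick sketch of what that overload point works out to, and of the pad attenuation a hypothetical 4V RMS balanced DAC source would force on this design:

```python
import math

# Overload point quoted above: 4.25 V peak-to-peak.
# For a sine wave, V_rms = V_pp / (2 * sqrt(2)) ~= V_pp / 2.83.
v_pp_max = 4.25
v_rms_max = v_pp_max / (2 * math.sqrt(2))
print(f"max input: {v_rms_max:.2f} V RMS")   # ~1.50 V RMS

# Pad needed for a hypothetical 4 V RMS balanced DAC source:
source = 4.0
pad_db = 20 * math.log10(source / v_rms_max)
print(f"pad required: {pad_db:.1f} dB")      # ~8.5 dB
```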
End rant :-)