Standard output voltage for RCA line-outs on DACs/CD players/streamers, etc.?


Hi there!
I have a Chord DAC64 converter whose line output is 3V (the XLR output appears to be correspondingly high as well), and this voltage seems to be overloading my amp's inputs.  The same was noticeable with my previous amp.
What is the de facto standard voltage for line-outs in consumer/high-end electronics?
Any other parameters that need to be checked when matching electronics?
Thanks!
ja_zz
Not sure there really is a standard, especially with respect to DACs.  As I understand it, line level for consumer equipment has traditionally been about 0.5-1 volt.
I was under the impression that most CD players and DACs had a 2V output, though there was no standard.  Your Chord does sound high.  If you're running it directly into an amp, you might consider using some kind of passive attenuator between the DAC and the amp.
Although I haven’t ever looked at the actual document, my impression is that the Sony/Philips "Red Book" spec for the CD medium, originally issued ca. 1980, specified a "full scale" (maximum) output for CD players of 2 volts. That would be for unbalanced outputs, and would correspond to 4 volts balanced. Those numbers are much higher than was (and usually still is) the case for analog sources.
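
For anyone who wants to see the arithmetic behind the unbalanced/balanced relationship, here is a quick sketch in Python. The 2 volt figure is, again, my impression of the spec rather than something verified against the actual document:

```python
import math

def dbv(volts):
    """Express a voltage in dB relative to 1 V RMS (dBV)."""
    return 20 * math.log10(volts)

red_book_unbal = 2.0               # nominal full-scale output, unbalanced (V RMS)
red_book_bal = 2 * red_book_unbal  # both legs driven in opposite polarity -> 2x

print(f"{red_book_unbal:.1f} V unbalanced = {dbv(red_book_unbal):+.1f} dBV")
print(f"{red_book_bal:.1f} V balanced   = {dbv(red_book_bal):+.1f} dBV (a fixed +6 dB)")
```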

Nevertheless, it is common for many digital sources to exceed those numbers by up to around 25%. The 3 volt/6 volt numbers for your Chord DAC, though (which John Atkinson measured as being closer to 3.1/6.2 volts), are the highest I’ve ever seen.
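
In dB terms, by the way, the excess over a typical 2 volt source is modest; a quick calculation using the measured figure:

```python
import math

# How far the Chord's measured ~3.1 V unbalanced output sits above a
# typical 2 V full-scale source -- i.e., roughly how much attenuation
# would bring it back in line:
excess_db = 20 * math.log10(3.1 / 2.0)
print(f"{excess_db:.1f} dB")   # about 3.8 dB
```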

As rcprince suggested, a pair of inline passive attenuators, such as the Rothwells, which are offered in both unbalanced and balanced form, may be a good solution. As I mentioned in one of your other recent threads, some users (including me) have had positive experiences with them, while others have reported that they compromise dynamics to some degree. The low output impedance and apparently hefty drive capability of the Chord, as well as the high input impedances of the amps you indicated you were considering in the other thread, would all seem to favor a positive outcome with them.
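
To put some rough numbers on the impedance point, the sketch below models an inline attenuator as a simple series/shunt divider. Note that the resistor values and the output impedance shown are placeholders I've made up for illustration; they are not Rothwell's or Chord's actual figures:

```python
import math

def parallel(a, b):
    """Equivalent resistance of two resistors in parallel."""
    return a * b / (a + b)

def delivered_fraction(z_out, r_series, r_shunt, z_in):
    """Fraction of the source voltage reaching the amp's input through
    an inline series/shunt attenuator."""
    load = parallel(r_shunt, z_in)  # shunt leg in parallel with the amp input
    return load / (z_out + r_series + load)

z_out = 75.0       # source output impedance, ohms (assumed)
r_series = 6800.0  # attenuator series resistor, ohms (assumed)
r_shunt = 10000.0  # attenuator shunt resistor, ohms (assumed)

for z_in in (10e3, 47e3, 100e3):
    frac = delivered_fraction(z_out, r_series, r_shunt, z_in)
    print(f"Zin = {z_in / 1e3:>4.0f}k: {20 * math.log10(frac):+.1f} dB")
```

The point of the exercise: the higher the amp's input impedance relative to the attenuator's shunt leg, the closer the actual attenuation stays to its nominal value, which is why the combination you described looks favorable.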

Another approach, if applicable, might be to drive balanced inputs of the integrated amp or preamp with the unbalanced outputs of the Chord, via an adapter such as this one. That would cut the voltage seen by the amp or preamp in half, compared to a balanced-to-balanced connection. Whether or not there would be adverse sonic consequences resulting from that approach is equipment dependent and probably unpredictable.
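
Quantifying that, using the measured figures and the nominal 4 volt balanced level mentioned earlier:

```python
import math

# Feeding a balanced input from the Chord's unbalanced output means the
# amp sees ~3.1 V instead of the ~6.2 V of a balanced-to-balanced hookup:
print(f"{20 * math.log10(3.1 / 6.2):+.1f} dB")   # -6.0 dB (voltage halved)

# Relative to a typical 4 V balanced full-scale source, 3.1 V is
# actually slightly below par:
print(f"{20 * math.log10(3.1 / 4.0):+.1f} dB")   # about -2.2 dB
```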

Regards,
-- Al

Dear all,
Thanks for your input.

almarg,
I do remember and do value your input.  I am in the process of getting attenuators for the line-ins.  However, my question was about how much I actually need to attenuate the signal.  Now I know.

Also, all kinds of adapters seem to degrade the signal due to the additional connections...  On top of that, the amps I am planning to use are all unbalanced, so I don't have the option of converting the unbalanced signal into balanced.

But thanks again.
Also, all kinds of adapters seem to degrade the signal due to the additional connections...  On top of that, the amps I am planning to use are all unbalanced, so I don't have the option of converting the unbalanced signal into balanced.
If by any chance any of those adapters adapt XLR outputs to RCA inputs, adverse effects (and very possibly SEVERE adverse effects) would certainly be understandable.  Most such adapters connect one of the two signals in the balanced signal pair (usually XLR pin 3) to ground (XLR pin 1).  The output circuits of some (although certainly not all) components will not be able to tolerate that.  See this thread for example.

On the other hand, in most circumstances which involve adapting an RCA output to an XLR input those two pins should be and usually are connected together by the adapter.
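
For reference, here is the usual wiring in both directions, written out as a sketch. The pin assignments follow the common convention (XLR pin 1 = ground/shield, pin 2 = hot, pin 3 = cold); any particular adapter should be verified against its own documentation:

```python
# Typical XLR-output-to-RCA-input adapter (the potentially risky direction):
XLR_OUT_TO_RCA_IN = {
    "XLR pin 2 (hot)":  "RCA center pin",
    "XLR pin 1 (gnd)":  "RCA shell",
    "XLR pin 3 (cold)": "RCA shell",  # cold leg shorted to ground -- the risky part
}

# Typical RCA-output-to-XLR-input adapter (usually benign):
RCA_OUT_TO_XLR_IN = {
    "RCA center pin": "XLR pin 2 (hot)",
    "RCA shell":      "XLR pins 1 and 3, tied together",
}
```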

Good luck.  Regards,
-- Al