Analogue clipping from digital sources


Given the high output of digital sources (typically >2 V single-ended on RCA, >4 V balanced), there is in my experience a significant risk of overloading both the analogue stage of the DAC and any pre- or power amp downstream. Fighting that with low volume settings on the attenuator only aggravates the issue. In my case I have to run the InnuOS Zenith Mk3's output at 85% to avoid audible overloading and distortion in the DAC/amp. Anyone with similar experience?

antigrunge2

Technically, the best solution is to lower the output after bit depth conversion (16 to 24 or 32 bits) but before upsampling.

Roon calls this setting Headroom Management.

The end result of this two-step dance is that you can reduce the maximum output without losing resolution, while minimizing how much you need to know about the original recording to avoid clipping.
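To make the order of operations concrete, here is a minimal sketch of that two-step approach in Python (the function name, the -3 dB figure and the NumPy implementation are mine for illustration; this is not how Sense or Roon actually implement it). The samples are widened first, then attenuated, so the level reduction spends headroom in the wider word rather than throwing away bits of the original 16.

```python
# Minimal sketch of digital headroom management (illustrative, assumed details).
# Idea: promote 16-bit samples to a wider format first, then attenuate, so the
# level reduction happens in the wider word, not in the original 16 bits.
import numpy as np

def apply_headroom(samples_16bit: np.ndarray, headroom_db: float = -3.0) -> np.ndarray:
    """Convert 16-bit PCM to 32-bit float, then scale it down by headroom_db."""
    # Step 1: bit-depth conversion (16-bit int -> 32-bit float, range -1.0..1.0)
    x = samples_16bit.astype(np.float32) / 32768.0
    # Step 2: attenuate before any upsampling/filtering stage
    gain = 10.0 ** (headroom_db / 20.0)   # -3 dB -> ~0.708
    return x * gain

# Example: a full-scale 16-bit sine no longer peaks at 0 dBFS after attenuation
t = np.arange(48000) / 48000.0
full_scale = (np.sin(2 * np.pi * 1000 * t) * 32767).astype(np.int16)
y = apply_headroom(full_scale, headroom_db=-3.0)
print(round(20 * np.log10(np.max(np.abs(y))), 2))  # ~ -3.0 dBFS
```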

I am on InnuOS Sense feeding into the Antelope Zodiac Platinum DAC and believe I can only use the attenuator embedded in Sense. Any other suggestions? Many thanks for your help, much appreciated!

BTW: this subject should find a wider audience.

Here is @atmasphere on a different thread addressing my topic:

“IMO Philips and Sony made a stupid mistake when they set the Redbook spec to 2V output with digital gear, more than many amps need to overload. I think their reasoning must have been that once you hear digital, you'll never want to hear any other source. Obviously if that was the thinking, it was grossly incorrect.

A smarter thing to do would have been to allow for a lower level DAC output in addition to the regular line section that's built into all DACs and CD players. This way if you happen to have a phono, tuner, tape machine or other source (perhaps video) you could use a regular preamp and get maximum fidelity....”

So, in addition to @erik_squires' insightful comment, there is an overall issue along the chain:

“IMO Philips and Sony made a stupid mistake when they set the Redbook spec to 2V output with digital gear, more than many amps need to overload.

The CD was introduced in 1982, with full knowledge of this. The reasoning was that a PREAMP could/can easily handle a 2V input. Amps may not, but since the idea was never to connect a CD player at full output directly to an amp, I'm not sure why this is an issue. Also, higher voltage = less noise (it's complicated) and less need for additional gain downstream.

Also, Ralph is right that for an AMP, 2V might be overload, but preamps have been 100% aware of the CD standard since then and are built for it, so I disagree. Preamps (and preamp stages) today can easily handle a 2 V input and put out whatever fraction of that you need for an amp.

Older gear, though, had far too much gain or too low a supply rail, which could cause an issue. By too much I mean it had a lot more gain than we can use, which contributed to noise. Better to have lower gain and wider use of the volume knob.

@erik_squires FWIW the high output digital problem is one of the issues that any preamp manufacturer has to find a way to deal with. With a phono section or tuner, you might need 15 dB of gain to work with most power amps, but you (most of the time) don't need any for digital.
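Putting rough numbers on that gain remark (the source levels and amp sensitivity below are assumptions for illustration, not measurements of any particular gear), a quick calculation shows why a DAC's 2 V output needs no line-stage gain at all:

```python
# Illustrative gain arithmetic (assumed figures, not from any specific amp):
# how much line-stage gain each source needs to drive a power amp to full output.
import math

def gain_needed_db(source_v_rms: float, amp_sensitivity_v_rms: float) -> float:
    """dB of gain required to bring a source up to the amp's input sensitivity."""
    return 20 * math.log10(amp_sensitivity_v_rms / source_v_rms)

AMP_SENSITIVITY = 1.5  # V RMS for full power (assumed, typical-ish)

print(round(gain_needed_db(0.3, AMP_SENSITIVITY), 1))  # tuner ~0.3 V       -> ~14.0 dB of gain needed
print(round(gain_needed_db(0.5, AMP_SENSITIVITY), 1))  # phono stage ~0.5 V -> ~9.5 dB
print(round(gain_needed_db(2.0, AMP_SENSITIVITY), 1))  # Redbook DAC 2 V    -> ~-2.5 dB, i.e. attenuation
```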

We've been lucky in that the patented direct-coupled output section of our preamps is neutral enough, and does enough to prevent coloration from the interconnect, that there is still a benefit to using our preamps with a digital source.