Analogue clipping from digital sources


Given the high output (typically >2 V single-ended RCA, >4 V balanced), there is in my experience a significant risk of overloading both the analogue output stage of the DAC and any pre- or power amp downstream. Compensating with low settings on the attenuator only aggravates the issue. In my case I have to run the InnuOS Zenith Mk3's output at 85% to avoid audible overload distortion from the DAC/amp. Anyone with similar experience?

antigrunge2

I guess what it boils down to is that in 2024, if you tell me your recent-vintage preamp is clipping with 2 V of input, I am going to have trouble believing it is not malfunctioning.

Are you referring to clipping?  I think it has been mathematically demonstrated that PCM can clip during playback even when no clipping was detected during encoding...

 

@ltmandella: care to elaborate? Or point to relevant sources? Please don't make unsubstantiated assertions when posting. Thanks.

and for everyone else, you can research "intersample peaks," also called intersample overs.  It is absolutely a known phenomenon among mastering engineers, discussed regularly and demonstrated in various tests. 

The consensus on audibility is that it is hardware (DAC) dependent.  It can be very objectionable on very accurate hardware but is generally not audible through lossy formats or a low-end reproduction chain.  Avoiding it requires headroom in the encoding, which the loudness wars don't always allow.
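For anyone who wants to see the effect for themselves, here is a minimal Python sketch (assuming NumPy is available) of the classic demonstration: a sine at fs/4 with a 45-degree phase offset lands every sample at exactly 0 dBFS, yet the continuous waveform between the samples peaks about +3 dB higher. FFT zero-padding stands in here for the ideal sinc reconstruction that a DAC's interpolation filter approximates; it is an illustration, not a model of any particular DAC.

```python
import numpy as np

N = 1024
n = np.arange(N)

# Sine at fs/4 with 45-degree phase: every sample hits +/- sqrt(2)/2,
# so after normalization each sample sits exactly at digital full scale.
x = np.sin(2 * np.pi * 0.25 * n + np.pi / 4)
x /= np.max(np.abs(x))  # normalize sample values to 0 dBFS

# 8x oversample by zero-padding the spectrum (periodic sinc interpolation),
# a stand-in for the reconstruction filter inside a DAC.
up = 8
X = np.fft.rfft(x)
Xp = np.zeros(N * up // 2 + 1, dtype=complex)
Xp[: len(X)] = X
y = np.fft.irfft(Xp, n=N * up) * up  # rescale for the longer inverse FFT

print(np.max(np.abs(x)))  # 1.0    -- no sample exceeds full scale
print(np.max(np.abs(y)))  # ~1.414 -- reconstructed peak is ~ +3 dBFS
```

This is why a "0 dBFS" file can still overload a DAC's output stage: the samples are legal, but the analogue waveform they encode is not, unless the mastering left headroom.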

Probably why I prefer DSD.  I am unfortunately very annoyed by any high-frequency or peak glitches in digital.  Makes me want to immediately throw the offending component right out the window...