... could it be possible that the impedance of the RCA only has to be greater than 50, 75, or 110 ohms?
@williewonka
Hi Steve,
Thanks for providing the additional background. But no, for accurate, minimally distorted transmission of digital waveforms, the "characteristic impedance" of a digital cable, the output impedance of the component providing the digital signal, and the input impedance of the component receiving the signal, should all be very nearly identical.
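To put a rough number on why merely being "greater than" the required value isn't good enough: the fraction of the signal that bounces back at a mismatch is given by the standard reflection coefficient formula, Gamma = (Zload − Zline)/(Zload + Zline). A minimal Python sketch, where the 90 ohm input impedance is purely an assumed illustration:

```python
# Reflection coefficient at an impedance discontinuity:
# Gamma = (Z_load - Z_line) / (Z_load + Z_line).
# A nonzero Gamma means part of the wave is reflected back down the cable.
def reflection_coefficient(z_load, z_line):
    return (z_load - z_line) / (z_load + z_line)

print(reflection_coefficient(75.0, 75.0))   # 0.0   -- exact match, no reflection
print(reflection_coefficient(90.0, 75.0))   # ~0.09 -- "greater than" still reflects ~9%
```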
In saying that, I should emphasize two things. First, this has nothing to do with accurate conveyance of the digital bits, which can be expected to be perfect in any halfway reasonable interconnection. Second, the distortion of the waveform that is received by the destination component has no **direct** relation to distortion that may result in the sound that is eventually heard, as it might in the case of transmission of analog signals. But what may happen is that the frequency components corresponding to that distortion of the digital waveform, which will be well into the RF region, probably at several tens of MHz or even more, may find their way past the receiving circuit (via grounds, power supplies, stray capacitances, etc.) and contribute to timing jitter at the point of D/A conversion, or to intermodulation, AM demodulation, or other such effects at analog circuit points further downstream. All of this is very much dependent, in unpredictable ways, on the particular components, on the normally unspecified "risetimes" and "falltimes" of the signal provided by the source component (i.e., the amount of time required for the signal to transition between its two voltage states), and on the length and several other characteristics of the particular cable.
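To illustrate that last point with a ballpark number: a common rule of thumb is that the significant frequency content of a digital edge extends to roughly 0.35 divided by the 10%–90% risetime. A minimal sketch, assuming a plausible 5 ns risetime (the actual figure, as I said, is normally unspecified):

```python
# Rule-of-thumb bandwidth of a digital edge: f ~ 0.35 / t_rise (10%-90%).
t_rise = 5e-9                    # assumed risetime in seconds; rarely specified
f_knee = 0.35 / t_rise           # ~70 MHz
print(f"~{f_knee / 1e6:.0f} MHz of significant frequency content")
```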
Ground loop effects between interconnected digital components can also be cable-sensitive, btw, potentially resulting in low-level, high-frequency noise being injected into the DAC or other receiving component, with consequences similar to those I’ve described above.
All of this is completely different from the potential effects of cables conveying analog audio signals, due to the vastly higher frequency components of digital audio signals. In the digital case, bit rates are already well into the RF region (roughly between 1.5 and 10 MHz or so, depending on whether it is redbook data, 24/192 data, etc.), where imprecise impedance matching can degrade waveform quality due to signal reflection effects. And the frequency content of the risetimes and falltimes of those signals is at much higher frequencies still, at several tens of MHz even for redbook data, as I said.
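For anyone who wants to check those ballpark figures, here is a small sketch of the raw S/PDIF line bit rates, assuming the standard framing of 64 bits (two 32-bit subframes) per sample frame; biphase-mark coding then places transitions on the wire at up to twice these rates:

```python
# S/PDIF line bit rate: 64 bits per sample frame (2 subframes x 32 bits).
def spdif_bit_rate(sample_rate_hz):
    return sample_rate_hz * 64   # bits per second on the wire

for fs in (44_100, 96_000, 192_000):
    print(f"{fs / 1000:g} kHz audio -> {spdif_bit_rate(fs) / 1e6:.2f} Mb/s")
# 44.1 kHz -> 2.82 Mb/s; 192 kHz -> 12.29 Mb/s
```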
One thing that follows from all of this, IMO, is that any similarity that may be observed between the sonic effects of various metals and geometries when used in analog audio cables and when used in digital audio cables is entirely coincidental.
And, finally, I would approach with a great deal of skepticism any claim that a specific wire type can simultaneously be optimal for 50 ohm, 75 ohm, and 110 ohm digital applications, and I feel pretty certain that nearly all other experienced digital circuit designers would agree. While the mismatch may make no audible difference in some systems, and in others might even produce a difference that is subjectively preferable, why do what amounts to introducing a known design flaw into the system?
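To quantify the basis for that skepticism, here is a minimal sketch assuming, purely for illustration, a 75 ohm cable driven into each of the three standard termination values:

```python
# No single characteristic impedance can match 50, 75, and 110 ohm
# terminations simultaneously; whichever value is chosen leaves a
# substantial reflection at two of the three.
def gamma(z_load, z_line):
    return (z_load - z_line) / (z_load + z_line)

z_cable = 75.0                               # assumed cable impedance
for z_load in (50.0, 75.0, 110.0):
    print(f"{z_load:.0f} ohm load: |Gamma| = {abs(gamma(z_load, z_cable)):.2f}")
# 50 ohm: 0.20, 75 ohm: 0.00, 110 ohm: 0.19 -- matched at only one of the three
```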
Best regards,
-- Al