Bill, I hear you. But I am optimistic that we are getting to the point where these problems won't matter.
An optical link has dispersion, and the O/E converter generates its own noise, so you are technically correct that you can't eliminate noise altogether. But optical completely isolates the DAC from noise generated at the source, as well as from the EMI and RFI that an electrical cable would otherwise carry in.
All digital devices generate their own internal jitter, including asynchronous DACs, so you are technically correct that you can't eliminate jitter altogether. But you can completely isolate the DAC from source jitter.
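To make the isolation point concrete, here is a toy Python sketch of asynchronous reclocking. It is my own illustration, not any particular DAC's design, and the jitter figures are made-up numbers: samples arrive from the source with heavy timing jitter, sit in a FIFO, and are played out on the DAC's own local clock, so only the sample values cross the interface.

```python
# A minimal sketch, not any particular DAC's design: an asynchronous FIFO
# decouples output timing from source timing. Samples arrive with large
# source jitter but are released strictly on a clean local clock.
import random

FS = 48_000                        # nominal sample rate, Hz
N = 16                             # samples to illustrate
BUFFER = 4                         # samples of headroom before playback starts

# Source side: nominal arrival times plus large random jitter (seconds).
arrival = [n / FS + random.gauss(0, 200e-9) for n in range(N)]

# DAC side: playback starts once the FIFO holds BUFFER samples, then each
# sample is released on the local clock (with only tiny local jitter).
start = arrival[BUFFER - 1]
playout = [start + (n + 1) / FS + random.gauss(0, 1e-9) for n in range(N)]

for n, (a, p) in enumerate(zip(arrival, playout)):
    assert p > a, "FIFO underrun: sample played before it arrived"
    print(f"sample {n:2d}: arrived {a*1e6:9.3f} us, played {p*1e6:9.3f} us")
```

The output spacing depends only on the local clock; the source jitter never shows up at the analog side, as long as the buffer doesn't underrun.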
Bit errors will always exist. But they can be corrected.
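For anyone who wants to see that in action, here is a small Python illustration using a Hamming(7,4) code. It's just the textbook scheme, not what any specific audio transport uses; real links use stronger codes, or retransmission and local buffering for asynchronous protocols, but the principle of detecting and repairing a flipped bit is the same.

```python
# A minimal sketch of "bit errors can be corrected": a Hamming(7,4) code
# detects and repairs any single flipped bit in a 7-bit block.

def hamming74_encode(d):
    """d: list of 4 data bits -> 7 coded bits, ordered p1 p2 d1 p3 d2 d3 d4."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """c: list of 7 received bits -> (corrected 4 data bits, error position or 0)."""
    c = c[:]                                    # don't mutate the caller's list
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]              # parity check over positions 1,3,5,7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]              # parity check over positions 2,3,6,7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]              # parity check over positions 4,5,6,7
    pos = s1 + 2 * s2 + 4 * s3                  # 0 = no error, else 1-based position
    if pos:
        c[pos - 1] ^= 1                         # flip the offending bit back
    return [c[2], c[4], c[5], c[6]], pos

data = [1, 0, 1, 1]
sent = hamming74_encode(data)
received = sent[:]
received[5] ^= 1                                # simulate one bit error in transit
fixed, where = hamming74_decode(received)
print(f"sent {sent}, got {received}, corrected bit {where}, data ok: {fixed == data}")
```

Run it and the flipped bit is located and reversed, so the recovered data matches what was sent.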
In summary, we have ways to correct bit errors 100%, to completely isolate source noise, and to render jitter irrelevant. What we don't have is all three in one design ... yet.
Of course, all of this applies only up to the DAC. Even if the DAC's output signal is pristine, it can still be contaminated by EMI/RFI after it leaves the DAC.
But that's a different world we are talking about.