Kijanki, let me make sure I'm tracking with you. You're saying that jitter is important; that jitter can also result from cable-induced errors; and that re-clocking at the DAC does not necessarily correct all (or any) jitter-related errors introduced while the raw digital signal travels through a cable. Does it follow that some digital cables are better at delivering digital signals with less (or no) added jitter?
On a related note: in theory or in measurement, can a digital signal be corrupted in a cable, say by exposure to strong EMF, to the point where 1s and 0s are actually dropped or unreadable at the DAC, i.e., outright data loss?
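To make the data-loss question concrete, here is a toy sketch (my own illustration, not anything Kijanki said) of what a single EMI-induced bit error would look like at the receiving end. The point it demonstrates: S/PDIF-style audio links carry no retransmission or forward error correction for the audio payload, so a flipped bit is not "deleted" so much as delivered as a wrong sample value.

```python
import random

random.seed(42)  # fixed seed so the sketch is repeatable

def flip_random_bit(sample: int, bits: int = 16) -> int:
    """Model a single EMI-induced error: flip one randomly chosen
    bit of a 16-bit PCM sample."""
    return sample ^ (1 << random.randrange(bits))

original = 1000                      # a hypothetical 16-bit PCM sample
corrupted = flip_random_bit(original)

# The corrupted sample always differs from the original by exactly one bit,
# and with no error correction it reaches the DAC as-is.
print(original, corrupted, bin(original ^ corrupted).count("1"))
```

Whether such errors actually occur at audible rates on a real cable is exactly the measurement question being asked; this sketch only shows what one error would mean for the data.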