Digital audio signals have significant frequency components extending up to several tens of MHz, and even higher in some cases, associated with the very fast transition times between their two voltage states (i.e., the "risetimes" and "falltimes" of the signals). At those frequencies, even minor differences in how the connectors are soldered at the two ends, as well as other tiny mechanical asymmetries, can affect the signal reflections that occur as a result of less than perfect matches of "characteristic impedance" among the components, the connectors on the components and on the cable, and the cable itself. And at RF frequencies impedance matches are never 100% perfect.

Those reflections will in turn combine with and affect the characteristics and quality of the original signal waveform, which in turn may affect timing jitter at the point of D/A conversion, depending on many component- and cable-dependent variables, including the arrival times at the DAC of reflections and re-reflections. That in turn may vary depending on the degree to which the impedance match at each end of the cable is less than perfect, and therefore on which end of the cable is connected to which component. The length of the cable, the propagation velocity of the particular cable, and the jitter rejection capability (if any) of the particular DAC are among the other variables that factor into all of this, BTW.
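To put rough numbers on this, here is a back-of-the-envelope sketch using the standard transmission-line formulas. The 75 ohm figure is the S/PDIF nominal; the slightly-off cable and input impedances, the 1.5 m length, and the 0.66 velocity factor are made-up but plausible example values, not measurements of any particular product:

```python
# Back-of-the-envelope numbers for an S/PDIF coax link.
# All specific impedance/length/velocity values below are
# hypothetical examples, chosen only to illustrate the math.

def reflection_coefficient(z_load, z0):
    """Fraction of the incident voltage reflected at an impedance discontinuity:
    gamma = (Z_load - Z0) / (Z_load + Z0)."""
    return (z_load - z0) / (z_load + z0)

Z0_CABLE = 71.0   # hypothetical cable characteristic impedance, ohms (nominally 75)
Z_DAC_IN = 80.0   # hypothetical DAC input impedance, ohms
gamma = reflection_coefficient(Z_DAC_IN, Z0_CABLE)
print(f"Reflection coefficient at DAC input: {gamma:.3f}")

# Round-trip time of a reflection on a 1.5 m cable, assuming a
# typical coax velocity factor of 0.66 (fraction of light speed).
C = 299_792_458.0          # speed of light, m/s
length_m = 1.5
velocity = 0.66 * C
round_trip_ns = 2 * length_m / velocity * 1e9
print(f"Round trip of a reflection: {round_trip_ns:.1f} ns")
```

The point of the arithmetic: a few ohms of mismatch reflects a few percent of each signal edge, and on a short cable the reflection arrives back within tens of nanoseconds, i.e., within the same bit period of the S/PDIF waveform, where it can shift the timing of zero crossings at the DAC input.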
Not to mention that the Adcom cable that was referred to may not have had a well-controlled 75 ohm characteristic impedance to begin with, which would certainly affect its susceptibility to reflection effects.
Put simply, if the cable was used for a digital interconnection, the cited experience says nothing about wire directionality.
Regards,
-- Al