Importance of clocking


There is a lot of talk that external clocks don’t work because of the distance to the processor. My experience is the opposite. While I had been using an external Antelope rubidium clock on my Etherregen and Zodiac Platinum Dac, I have now added a Lhy Audio UIP, clocked by the same Antelope clock, to reclock the USB stream coming from the InnuOS Zenith MkIII. The resulting increase in soundstage depth, attack and decay, and overall transparency isn’t subtle. While there seems to be lots of focus on cables, accurate clocking throughout the chain still seems to be deemed unnecessary. I don’t understand InnuOS selling separate reclockers for USB and Ethernet without synchronising Ethernet input, DAC conversion and USB output.

antigrunge2

Buffering is not a recipe for avoiding jitter.

Yes, it is.  With asynchronous USB, data is delivered in packets (frames) to a buffer at set time intervals, for instance every 1 ms (roughly 44 samples at 44.1 kHz).  The DAC delivers each sample to the D/A converter at exact intervals (this cannot be improved from outside), signaling buffer over/underflow back to the source.  The problem of data arriving at different or uneven intervals is eliminated by the buffer and the feedback signaling.  Synchronizing the timing of frame delivery with the D/A clock won’t help, because the D/A converter already operates at exactly even intervals.
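The buffer-plus-feedback mechanism described above can be sketched as a toy simulation (all numbers are made up for illustration; one loop iteration stands in for one tick of the DAC’s own fixed sample clock). Frames arrive at jittery times, yet the DAC consumes exactly one sample per tick, and the feedback value keeps the buffer from ever running dry:

```python
from collections import deque
import random

def simulate(ticks=20_000, frame_interval=44, target=88, jitter=5, seed=1):
    """Toy model of asynchronous USB audio flow control.

    The host tries to deliver a frame every `frame_interval` ticks, but
    each delivery is early or late by up to `jitter` ticks.  The DAC
    still converts exactly one sample per tick of its own clock; the
    feedback channel only tells the host how many samples to put in
    the next frame, steering the buffer fill back toward `target`."""
    rng = random.Random(seed)
    buf = deque([0.0] * target)        # start with the buffer prefilled
    next_frame_at = 0
    requested = frame_interval         # feedback-controlled frame size
    underflows = 0
    for tick in range(ticks):
        if tick >= next_frame_at:      # a frame arrives, at a jittery time
            buf.extend([0.0] * requested)
            next_frame_at = tick + frame_interval + rng.randint(-jitter, jitter)
            # feedback: ask for more (or fewer) samples in the next frame
            requested = max(0, frame_interval + (target - len(buf)))
        if buf:
            buf.popleft()              # exact-interval D/A conversion
        else:
            underflows += 1            # this would be an audible glitch
    return underflows, len(buf)
```

Despite ±5 ticks of arrival jitter, the buffer fill just oscillates in a bounded band and underflows never occur, which is the point: the conversion timing is set entirely by the DAC’s local clock, not by when the packets happened to arrive.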

In the case of S/PDIF, buffering in the DAC won’t help, because without back signaling the data might be coming too fast or too slow.  This can be fixed by reclocking or, most often, by adjusting the D/A conversion rate to the average rate of the incoming data.
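A hedged sketch of that “adjust to the average rate” idea (a toy control loop, not any real DAC’s PLL or ASRC; every constant here is invented): the DAC can’t observe the source’s true rate directly, only its own buffer, yet slewing its conversion rate based on buffer fill locks it onto the incoming average.

```python
def spdif_rate_adapt(steps=50_000, in_rate=44.1007, nominal=44.1,
                     target=100.0, kp=0.02, ki=1e-4):
    """Each loop step = 1 ms.  S/PDIF has no back-channel, so the DAC
    cannot ask the source to slow down.  Instead it nudges its own
    conversion rate using only what it can observe: the buffer fill
    and how the fill changes (a simple proportional-integral loop)."""
    fill = target          # samples currently buffered
    out_rate = nominal     # DAC conversion rate, samples per ms
    for _ in range(steps):
        delta = in_rate - out_rate     # observable as the change in fill
        fill += delta
        out_rate += kp * delta + ki * (fill - target)
    return out_rate, fill
```

With these (made-up) gains the loop is critically damped: the conversion rate converges to the source’s actual 44.1007 samples/ms and the buffer settles back at its target, even though the DAC never measured the incoming rate directly.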

What might help with asynchronous USB is isolation that prevents the injection of electrical noise from the computer into the DAC.  Reclocking has nothing to do with any improvement, but people buy it, likely because it sounds promising.

What about the clock accuracy releasing the buffer, then?

 

I specifically addressed this in my first message. You literally can’t avoid jitter without buffering, but in the case of a network stream from outside the home, adding more "clocking" at upstream devices without buffers doesn’t help you, because the original stream has huge amounts of packet-to-packet variation (huge relative to audio or video playback). In fact, you can end up in a situation where your DAC’s jitter is worse, because there’s an upstream "reclocker" with so much jitter that it forces the downstream DAC to misbehave.

The best possible place to put a fancy clock is less than an inch away from the DAC. External clocks are used in music production when multiple DACs or streams run at once, as with multi-track recordings and mix-downs. Those clocks do NOT, however, dejitter anything.

In my setup, both the Etherregen on Ethernet and the UIP on USB can run on their own clocks or be supported by the 10MHz clock. The difference in sound quality isn’t subtle, and that is what my post was about. It seems you are arguing from first principles rather than from your own experience.

What about the clock accuracy releasing the buffer, then?
 

It is the DAC’s internal clock, which is unaffected by any external activity (hence "asynchronous").  The DAC uses this clock to take each sample from the buffer and convert it to a voltage at exact intervals.

It does not matter if this internal clock is slightly inaccurate.  We cannot detect music playing a tiny fraction of a percent slower or faster, but we can hear when this clock is jittery, because uneven D/A conversion intervals add additional frequencies to the audio signal.  We likely cannot hear jitter artifacts when jitter is below about 50 ps (peak to peak), so names like “femtosecond clock” are nonsense.
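As a sanity check on the “uneven intervals add frequencies” claim, here is a small pure-Python sketch: sampling an ideal sine at jittered instants puts energy into sidebands at the tone frequency ± the jitter rate, energy that evenly clocked sampling does not have. The jitter amplitude is wildly exaggerated (1 µs rather than picoseconds) purely so the effect is numerically visible; all of the numbers are illustrative.

```python
import cmath
import math

def dft_bin_mag(x, k):
    """Magnitude of DFT bin k, normalised so a full-scale sine
    reads 0.5 in its own bin."""
    N = len(x)
    return abs(sum(v * cmath.exp(-2j * math.pi * k * n / N)
                   for n, v in enumerate(x))) / N

N, FS = 4096, 48_000
K0, KJ = 1000, 100            # test-tone bin and jitter-modulation bin
F0 = K0 * FS / N              # ~11.72 kHz test tone, exactly bin-aligned
A = 1e-6                      # 1 us of sinusoidal jitter (hugely
                              # exaggerated; real clocks are ps-level)

def sample(jitter_amp):
    """Sample a pure sine at instants displaced by sinusoidal jitter."""
    out = []
    for n in range(N):
        t = n / FS + jitter_amp * math.sin(2 * math.pi * KJ * n / N)
        out.append(math.sin(2 * math.pi * F0 * t))
    return out

clean = sample(0.0)
jittered = sample(A)

# Jitter turns into spectral sidebands at F0 +/- (KJ * FS / N):
sb_clean = dft_bin_mag(clean, K0 + KJ)
sb_jittered = dft_bin_mag(jittered, K0 + KJ)
```

With even sampling the sideband bin holds only floating-point dust; with jittered sampling it holds real signal energy (roughly the carrier times half the peak phase deviation), i.e. a spurious tone that was never in the source.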