Importance of clocking


There is a lot of talk that external clocks don't work because of the cable distance to the processor. My experience is the opposite. Having already used an external Antelope rubidium clock on my EtherRegen and Zodiac Platinum DAC, I have now added an LHY Audio UIP, clocked by the same Antelope clock, to reclock the USB stream coming from the InnuOS Zenith MkIII. The resulting increase in soundstage depth, attack and decay, and overall transparency isn't subtle. While there seems to be a lot of focus on cables, accurate clocking throughout the chain is still widely deemed unnecessary. I don't understand InnuOS selling separate reclockers for USB and Ethernet without synchronising Ethernet input, DAC conversion and USB output.

antigrunge2

Buffering is not a recipe for avoiding jitter.

Yes, it is. With asynchronous USB, data is delivered to a buffer in packets (frames) at some time interval, for instance one frame every 1 ms carrying roughly 44 samples at 44.1 kHz. The DAC delivers each sample to the D/A converter at exact intervals (this cannot be improved upon), signaling buffer over/underflow back to the source. The problem of data arriving at varying or uneven intervals is eliminated by the buffer and the back-signaling. Synchronizing the timing of frame delivery with the D/A clock won't help, because the D/A converter already operates at exactly even intervals.
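To make that concrete, here is a toy Python model of the flow control described above. The numbers (1 ms frames, a 10 ms buffer, ±1-sample feedback steps) are illustrative and not taken from any real USB audio implementation; the point is that frames can arrive as late and as unevenly as they like, while the output side consumes samples on its own perfectly even clock.

```python
import random
from collections import deque

# Toy model of asynchronous-USB flow control. One tick = 1 ms.
DAC_RATE = 44            # samples the DAC consumes per tick (44.1 kHz, rounded)
TARGET_FILL = 440        # nominal buffer level: about 10 ms of audio

buffer = deque([0] * TARGET_FILL)   # pre-filled before playback starts

def next_frame_size(fill):
    # Host-side feedback: nudge the frame size so the fill level
    # drifts back toward the target (the "back signaling" above).
    if fill > TARGET_FILL:
        return DAC_RATE - 1
    if fill < TARGET_FILL:
        return DAC_RATE + 1
    return DAC_RATE

pending = 0
for tick in range(1000):
    pending += 1                     # the host produces one frame per ms...
    if random.random() > 0.3:        # ...but delivery timing jitters badly,
        while pending:               # so frames arrive late and in bursts
            buffer.extend([0] * next_frame_size(len(buffer)))
            pending -= 1
    # The DAC side: exactly DAC_RATE samples out per tick, on its own clock.
    for _ in range(min(DAC_RATE, len(buffer))):
        buffer.popleft()             # one sample to the D/A converter

print("fill after 1000 ticks:", len(buffer))   # hovers near TARGET_FILL
```

However jittery the arrivals, the fill level just wanders around the target, and the D/A side never sees any of it.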

In the case of S/PDIF, buffering in the DAC won't help, because without back-signaling the data might be coming too fast or too slow, and the buffer would eventually over- or underflow. This can be fixed by reclocking or, most often, by adjusting the D/A conversion rate to the average rate of the incoming data.
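A rough sketch of that adaptive approach, with made-up loop coefficients: the receiver estimates the sender's average rate from the incoming blocks and slowly steers its own conversion rate toward it, which amounts to a first-order PLL in software.

```python
# First-order software PLL; coefficients are illustrative only.
NOMINAL = 44100.0    # receiver's nominal conversion rate, Hz
ALPHA = 0.001        # loop gain: small value = heavy averaging

local_rate = NOMINAL

def on_block(samples_received, seconds_elapsed):
    # Per received block: estimate the sender's instantaneous rate,
    # then move the local conversion rate a small step toward it.
    global local_rate
    sender_rate = samples_received / seconds_elapsed
    local_rate += ALPHA * (sender_rate - local_rate)
    return local_rate

# Example: the sender's clock runs 50 ppm fast; after enough blocks the
# receiver's conversion rate has slewed to match the incoming average.
for _ in range(10_000):
    on_block(4410 * (1 + 50e-6), 0.1)        # 0.1 s blocks, slightly fast
print(f"converged conversion rate: {local_rate:.3f} Hz")   # ~44102.205
```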

What might actually help with asynchronous USB is isolation that prevents electrical noise from being injected from the computer into the DAC. Reclocking has nothing to do with any improvement, but people likely buy it because it sounds promising.

OP:

Um, yeah, OK. I’m going to sit here and wait for you to explain to me how on earth you would even reclock a TCP/IP stream without actually buffering it.

Just lemme know when you work out that mathmagic.


Erik

I think that minimising jitter throughout the digital chain, rather than just at the D/A conversion stage, is of primary importance. Buffering is not a recipe for avoiding jitter. And it's the quality of the clock, more than the length of the cables, that matters.

@erik_squires “This is why TVs and streaming devices have relatively large buffers. Everything goes into a bucket, which the device attempts to keep full from one side and carefully doles out with a metronome on the other. That's where the clock matters.”

Nicely put!

Everything upstream of a streamer is what we call asynchronous. The timing is always rather loose, as an Internet-based connection guarantees neither latency nor inter-packet arrival time. Attempting to apply picosecond clocking to events that vary by hundreds of milliseconds is madness. For reference:

0.1 seconds is 100 milliseconds, which is 100,000,000,000 picoseconds or 100,000,000,000,000 femtoseconds.

This is why TVs and streaming devices have relatively large buffers. Everything goes into a bucket, which the device attempts to keep full from one side and carefully doles out with a metronome on the other. That's where the clock matters.
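Here is a bare-bones Python illustration of that bucket-and-metronome arrangement; the chunk sizes and delays are invented for the demo. One thread fills the bucket with wildly uneven timing, while the playback loop drains it on a steady local cadence, so none of the arrival jitter survives to the output.

```python
import queue
import random
import threading
import time

bucket = queue.Queue(maxsize=500)    # the "bucket" (playback buffer)

def network_side():
    # Chunks arrive with wildly uneven timing: anywhere from instantly
    # to 200 ms apart. None of this jitter reaches the output side.
    for chunk in range(100):
        time.sleep(random.uniform(0.0, 0.2))
        bucket.put(chunk)            # blocks only if the bucket is full

def playback_side():
    # The "metronome": one chunk out every 100 ms on the local clock.
    # A real player would pre-fill the bucket before starting playback.
    for _ in range(100):
        bucket.get()                 # blocks (a dropout) only if empty
        time.sleep(0.1)

threading.Thread(target=network_side, daemon=True).start()
playback_side()
```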

Everything else, IMHO, is there to keep noise out of the AC lines and interconnects.

As an example, I've watched my Internet connection fail and fail over to a backup provider. The process took around 70 seconds, during which I had no Internet at all. My music never stopped playing.