Importance of clocking


There is a lot of talk that external clocks don't work because of the distance to the processor. This is the opposite of my experience. While I had used an external Antelope rubidium clock on my Etherregen and Zodiac Platinum DAC, I have now added a Lhy Audio UIP, clocked by the same Antelope clock, to reclock the USB stream emanating from the InnuOS Zenith MkIII. The resultant increase in soundstage depth, attack and decay, and overall transparency isn't subtle. While there seems to be lots of focus on cables, accurate clocking throughout the chain still seems to be deemed unnecessary. I don't understand InnuOS selling separate reclockers for USB and Ethernet without synchronising Ethernet input, DAC conversion and USB output.

antigrunge2

What about the clock accuracy releasing the buffer, then?
 

It is the DAC's internal clock that is unaffected by any external activity (asynchronous). The DAC uses this clock to take samples of the data from the buffer and convert them to voltage at exact intervals.

It does not matter if this internal clock is slightly inaccurate. We cannot detect that music plays a tiny fraction of a percent slower or faster, but we can hear when this clock is jittery, because uneven D/A conversion intervals add additional frequencies to the audio signal. We likely cannot hear jitter artifacts when jitter is below about 50 ps (peak to peak), so names like "femtosecond clock" are nonsense.
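The point that timing jitter, not absolute clock accuracy, is what corrupts the signal can be illustrated with a toy simulation. This is only a sketch: the tone frequency, sample rate, and jitter magnitudes below are arbitrary illustrative numbers, and the uniform jitter model is a simplification.

```python
import math
import random

def jittered_samples(freq_hz, fs_hz, n, jitter_s, seed=0):
    """Sample a sine of freq_hz at nominal rate fs_hz; each sampling
    instant is perturbed by uniform timing jitter of +/- jitter_s seconds."""
    rng = random.Random(seed)
    return [math.sin(2 * math.pi * freq_hz * (i / fs_hz + rng.uniform(-jitter_s, jitter_s)))
            for i in range(n)]

def rms_error(jitter_s, freq_hz=1000.0, fs_hz=44100.0, n=8192):
    """RMS difference between ideally-timed and jitter-timed sample values,
    i.e. the error the timing jitter injects into the converted signal."""
    ideal = jittered_samples(freq_hz, fs_hz, n, 0.0)
    noisy = jittered_samples(freq_hz, fs_hz, n, jitter_s)
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(ideal, noisy)) / n)

# A (deliberately huge) 1 microsecond jitter vs. the ~50 ps threshold cited above.
err_1us = rms_error(1e-6)
err_50ps = rms_error(50e-12)
```

Note that a uniform speed error (every interval equally long) would change none of the sample values relative to the waveform, which is why absolute accuracy is inaudible while jitter is not.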
 

Releasing a dirty stream to the D/A increases its processing load, leading to distortion. This applies to all DACs but is particularly noxious when upsampling. That's why reclocking the data before conversion has benefits. And by the way, excessive buffering leads to latency, while underutilized buffers need empty-packet fill-in, again increasing the processing requirement on the D/A converter. Be that as it may, and as stated previously: in my setup the benefits of external master clocks are easily demonstrable.

Releasing a dirty stream to the D/A increases its processing

No, it doesn't. Latency has nothing to do with it, and buffers are never "underutilized", because of back signaling.

Perhaps an example will help:

A worker (the DAC) has to place items on the conveyor belt (the D/A converter) of some machine at exactly one-second spacing, but another worker (the computer) hands them over too slowly or too quickly, creating uneven spacing between them. The remedy is a shelf holding many items (the buffer). One worker adds a bunch of them (a frame) to the shelf every so often, and the other places items on the conveyor belt at even intervals, yelling back "get more (or less) next time" to keep a decent number of them on the shelf (back signaling).
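The shelf analogy is a classic producer-consumer loop with feedback, and it can be sketched in a few lines of code. The numbers here (frame size 10, target fill 100, +/-2 adjustment) are arbitrary illustrations, not anything a real DAC or USB implementation uses:

```python
from collections import deque

TARGET = 100   # items we try to keep "on the shelf" (buffer fill level)
FRAME = 10     # nominal items per delivery from the other worker

def run(ticks):
    """Producer delivers a frame every 10 ticks; consumer removes exactly
    one item per tick (the DAC's even-interval clock). Back signaling
    adjusts the next frame size to hold the buffer near TARGET."""
    buffer = deque()
    request = FRAME
    underruns = 0
    for t in range(ticks):
        if t % 10 == 0:                 # a frame arrives from the producer
            buffer.extend([t] * request)
        if buffer:                      # clock fires: convert one sample
            buffer.popleft()
        else:
            underruns += 1              # buffer ran dry: an audible glitch
        # "get more (or less) next time"
        if len(buffer) < TARGET:
            request = FRAME + 2
        elif len(buffer) > TARGET:
            request = FRAME - 2
    return underruns, len(buffer)

underruns, fill = run(2000)
```

Run long enough, the fill level settles into a small oscillation around the target and no underruns occur, even though deliveries are never exactly matched to consumption; that is the whole job of the feedback.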

@kijanki

 

You are not addressing my point on processor loading. Or who else is cleaning up the mess?

@antigrunge2 What processor? The only one that can affect the timing of the D/A converter is in the DAC. It loads samples from the buffer into the D/A converter at even time intervals. It also signals back when the buffer is too empty or too full.

Sending asynchronous data in packets reduces stress on the receiving end, allowing the receiver to take in the data at whatever time is required or convenient.