Can the digital "signal" be over-laundered, unlike money?


Pretty much what is implied by the title. 

Credit to @sns who got me thinking about this. I've chosen a path of refrain. Others have chosen differently.

I'm curious about members' thoughts and experiences on this.

Though this comes from a 'clocking thread' by no means am I restricting the topic to clocking alone.

Please consider my question from the perspective of all ["cleaning"] devices used in the digital chain, active and passive.

 

From member 'sns' and the Ethernet Clocking thread [for more context]:

 

"I recently experienced an issue of what I perceive as overclocking with addition of audiophile switch with OXCO clock.  Adding switch in front of server, NAS resulted in overly precise sound staging and images."

"My take is there can be an excessive amount of clocking within particular streaming setups.

...One can go [too] far, based on my experience."

 

Acknowledgement and Request:

- For the "bits are bits" camp, the answer is obvious and a given, and I accept that.

- The OP is directed at those who have utilized devices in the signal path for "cleaning" purposes.

Note: I am using 'cleaning' as a broad and general catch-all term...it goes by many different names and approaches.

 

Thank You! - David.

david_ten

I believe I mentioned this in a prior post on another thread. I think we can all agree optimal network performance requires galvanic isolation, proper timing, maximum jitter reduction, and shielding from EMI/RFI. With so many equipment choices to address these issues, it's highly likely every streaming solution is unique. What works for one situation may not work for another; this is especially true at the margins, when one already has an optimal or near-optimal setup. One may upset the delicate balance one has achieved by adding another network appliance.

 

One can speculate or presume my issues with the switch were due to inferior clocking, poor implementation, or inferior parts. Perhaps a higher-quality switch would further optimize my network, perhaps not; only insertion of such a switch would provide empirical evidence.

 

At this point I question how one knows when a network is optimized. If one's system is providing high resolution, natural timbre, balanced tonality, frequency extension at both ends, wonderful micro and macro dynamics, and precise, natural soundstaging and imaging, is that not proof of an optimized network? Is there a point where we can say enough is enough? The conundrum is that this is one of those known unknowns, the reason so many are never satisfied. We can't know if our present networks are optimized until we've tried any number of other network configurations.

 

While I try never to say never, I'm at the point where I'm satisfied with my present network; there are other, bigger fish to fry. My take is that until we have an all-fiber solution, I'm done.


It costs under $100 to "isolate" USB. Most tolerable DACs already isolate S/PDIF (something most audiophiles are clueless about while going on about isolation, TOSLINK, etc.). So there goes the isolation argument. Isolate USB, and with S/PDIF on most tolerable DACs there is no RF path; there goes that argument. Timing? There is no timing in USB data, and no timing in Ethernet (which is also transformer-isolated and fairly immune to RF). $15 DAC chips can remove nanoseconds of jitter, so there goes that argument.

 

I don't want to misinterpret you or misquote you, so could you please write this more clearly before I reply? For example, are you saying that timing/jitter does not matter on the USB interface? If so, you are confusing a purely data signal with the quasi-analog signal that is fed to a DAC. So, please clarify the whole thing. Thanks.

 

G

"I don't want to misinterpret you or misquote you, so could you please write this more clearly before I reply? For example, are you saying that timing/jitter does not matter on the USB interface? If so, you are confusing a purely data signal with the quasi-analog signal that is fed to a DAC. So, please clarify the whole thing. Thanks."

Timing/jitter on the USB interface does not matter. It has no impact on the DAC's analog output, which uses a completely separate clock. That clock does not need to be very expensive for very good audio performance. Sure, lots of expensive equipment makers say it does, but they can never support that claim. Chip-based DACs (new ones at least, not 30-year-old ones) are more immune to clock jitter as well.

Saying "quasi-analog" is marketing speak. It has no meaning. Clock jitter has meaning, namely clock jitter at the input to the DAC.

Electrical noise on the USB interface due to ground loops is an issue, which is why I addressed isolation.

Please don't come back with pseudo-technical marketing fluff. That is not going to cut it, except with people who have also drunk the Kool-Aid.

Competently designed DACs with asynchronous USB buffer the input and use their own clock, so no matter how many of these devices precede the DAC, they won't affect the analog output.
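The buffer-and-reclock idea above can be sketched in a toy simulation (my own illustration, with made-up packet timing numbers, not anything from the thread): samples arrive over the link with heavy timing jitter, land in a FIFO, and are clocked out at the DAC's fixed local sample rate, so output timing is set entirely by the local clock.

```python
import random

def usb_arrival_times(n, period_us=125.0, jitter_us=10.0, seed=0):
    """Simulate packet arrivals on an async USB link with heavy timing jitter."""
    rng = random.Random(seed)
    t, times = 0.0, []
    for _ in range(n):
        t += period_us + rng.uniform(-jitter_us, jitter_us)
        times.append(t)
    return times

def dac_output_times(n_samples, local_period_us=1e6 / 44_100):
    """The DAC empties its FIFO on its own fixed local clock (44.1 kHz here),
    so output sample timing depends only on that clock, not on arrivals."""
    return [i * local_period_us for i in range(n_samples)]

arrivals = usb_arrival_times(100)
outputs = dac_output_times(100)
in_intervals = [b - a for a, b in zip(arrivals, arrivals[1:])]
out_intervals = [b - a for a, b in zip(outputs, outputs[1:])]

# Input intervals wander by microseconds; output intervals are uniform.
print(f"input interval spread:  {max(in_intervals) - min(in_intervals):.2f} us")
print(f"output interval spread: {max(out_intervals) - min(out_intervals):.2e} us")
```

The simulation assumes the FIFO never under- or over-runs, which is exactly what the asynchronous USB feedback mechanism exists to guarantee in a real DAC.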