You didn’t cause any issues. Questions, explanations, debate and answers have always been welcome at Audiogon.
I was at a dinner some years ago with some really smart people (definitely not me), where part of the room was contemplating investing in the company represented by the other part of the room.
One of the key leaders in the company in question also happened to enjoy music, and while he didn't have an elaborate system by some standards, he was a streaming-only user. This was in the earliest (pre-Jay-Z) days of Tidal. Unprovoked, when he found out I was also streaming my music, he became quite animated and we had a terrific conversation. I came away with a quasi-explanation that satisfied my curiosity and has guided my streaming decisions ever since. The question I asked was why some streamed music sounded different from the same track, from the same hard drive, streamed a different way. At the time I was struggling with why USB feeding a Berkeley USB-to-SPDIF converter sounded better than USB direct to the same DAC.
The explanation he gave was well over my head, but it was clear he had given it a great deal of thought. He explained that Ethernet and USB data transmission don't just have error correction built in; they were designed around the assumption that errors are a foregone conclusion. Errors will ALWAYS be present, so the protocols were built to deal with their presence. The reason this isn't a big deal for most data is that most applications don't require precise absolute timing or, more importantly, a contiguous stream of data. If a one-billion-dollar electronic transfer from bank to bank begins "on time" and concludes 1 millisecond later than expected, it isn't a big deal. In fact, it's a desired outcome, since it likely means the data was error checked.

Now imagine streaming music. It absolutely requires proper buffering. FULL STOP. You can't have the data stream begin and then "error check" on the fly: jittery at best, dropouts at worst. So: how is the buffering performed, how much of the data can be buffered into memory, how is the data streamed out of the buffer, and where and how is the clocking/reclocking performed? There was a lot more to the conversation, and today's streamers/DACs do a wonderful job with the challenge. Some do it better than others, and some do it completely differently.
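To make the buffering idea concrete, here's a toy sketch in Python (my own illustration, not how any particular streamer or DAC actually works): frames arrive from the network in irregular bursts, playback doesn't start until the buffer is primed, and the playback side then pulls frames at a steady rate. The class and parameter names are made up for the example.

```python
from collections import deque

class PlaybackBuffer:
    """Toy jitter buffer: absorb irregular network arrivals,
    release frames at a steady rate. Illustrative only."""

    def __init__(self, prefill_frames):
        self.buf = deque()
        self.prefill = prefill_frames
        self.started = False

    def receive(self, frame):
        # Network side: frames may arrive in bursts or late; just queue them.
        self.buf.append(frame)
        if not self.started and len(self.buf) >= self.prefill:
            self.started = True  # only start playback once the buffer is primed

    def next_frame(self):
        # Playback side: called at a fixed clock rate. Returns None
        # (a dropout) only if the buffer underruns.
        if not self.started or not self.buf:
            return None
        return self.buf.popleft()

# Simulate bursty arrivals: four frames at once, then a gap, and so on.
jb = PlaybackBuffer(prefill_frames=3)
out = []
arrivals = [[1, 2, 3, 4], [], [5], [6, 7], []]
for burst in arrivals:
    for f in burst:
        jb.receive(f)
    out.append(jb.next_frame())  # one steady "tick" of the playback clock

print(out)  # steady output despite bursty input: [1, 2, 3, 4, 5]
```

The point of the sketch is the ordering: error checking and retransmission happen on the irregular network side, while the DAC only ever sees the steady, pre-filled output side, clocked locally.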
Anyway, fast forward to today. We are still in the process of understanding why certain materials, cable lengths, shielding (or the lack thereof), and transmission methods provide better results than others. But it is audible. I've tried to buy equipment from companies that seem to grasp the importance of timing, lower jitter, and clocking. The good news for us all is that most of what's available today does a decent job of dealing with the aforementioned; the differences really show up when you're trying to maximize performance. The bits-are-bits camp simply hasn't explored the topic deeply enough to understand how much there is to be discovered.