How can different CAT5/6 cables affect sound?


While it is beyond doubt that analog cables affect sound quality, and that SPDIF, TOSlink and AES/EBU can affect SQ depending on the buffering and clocking of the DAC, I am at a loss to find an explanation for how different CAT5 cables can affect the sound.

Audio streamed over CAT5 is transmitted using the TCP protocol.  This protocol is error-checked: each packet carries a header with a checksum.  If the checksum the receiver computes matches the one in the header, it acknowledges the packet; if no acknowledgement arrives within the timeout interval, the sender resends the packet.  Packets may also arrive out of order, and the receiver must sequence them correctly.

Thus, unless the cable is hopeless (in which case nothing works), the receiver has an exact copy of the data sent by the sender, AND there is NO timing information associated with TCP. The receiver must therefore rely on its own internal clock for timing.
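To illustrate what I mean, here is a toy sketch of that resend-until-correct behaviour in Python. Everything is simplified and the helper names are invented (real TCP lives in the OS kernel and uses sliding windows), but it shows why the bits at the receiving end are always identical:

```python
import random
import zlib

# Toy stop-and-wait retransmission over a deliberately flaky "link".
# maybe_corrupt and send_reliably are made-up names for illustration only.

def maybe_corrupt(packet: bytes, error_rate: float) -> bytes:
    """With some probability, flip one byte, mimicking a noisy cable."""
    if random.random() < error_rate:
        damaged = bytearray(packet)
        damaged[random.randrange(len(damaged))] ^= 0xFF
        return bytes(damaged)
    return packet

def send_reliably(chunks, error_rate=0.3) -> bytes:
    received = []
    for seq, chunk in enumerate(chunks):
        checksum = zlib.crc32(chunk)                 # checksum travels in the header
        while True:                                  # resend until acknowledged
            wire = maybe_corrupt(chunk, error_rate)  # packet crosses the flaky link
            if zlib.crc32(wire) == checksum:         # receiver's checksum matches?
                received.append((seq, wire))         # yes: ACK, keep the packet
                break                                # no: sender times out, resends
    random.shuffle(received)                         # pretend packets arrived out of order
    received.sort(key=lambda p: p[0])                # receiver re-sequences them
    return b"".join(chunk for _, chunk in received)

audio = bytes(random.randrange(256) for _ in range(4096))
chunks = [audio[i:i + 256] for i in range(0, len(audio), 256)]
assert send_reliably(chunks) == audio   # bit-perfect despite the corrupted packets
print("Receiver has an exact copy of the sender's data.")
```

Note that nothing in this loop says when the data arrives; timing is entirely up to the receiving end, which is the point.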

That is different with SPDIF: clocking is embedded in the stream, which is why sources (e.g. high-end Aurenders) have very accurate, low-jitter OCXO clocks and can sound better than USB connections into DACs with less precise clocks.
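For comparison, SPDIF embeds the clock using biphase-mark coding: the line level flips at every bit boundary, so the receiver recovers the transmitter's clock (and its jitter) from the waveform itself. A very rough sketch, ignoring preambles, subframes and parity:

```python
def biphase_mark_encode(bits):
    """Simplified biphase-mark coding as used by SPDIF: the line level
    toggles at every bit-cell boundary (that is the embedded clock);
    a '1' toggles again mid-cell, a '0' holds steady for the cell."""
    level, line = 0, []
    for bit in bits:
        level ^= 1            # boundary transition: present for every bit
        line.append(level)    # first half of the cell
        if bit:
            level ^= 1        # extra mid-cell transition encodes a '1'
        line.append(level)    # second half of the cell
    return line

# Two half-cells per bit; the guaranteed boundary transitions are what a
# DAC's receiver locks to, so the source's clock quality travels with the data.
print(biphase_mark_encode([1, 0, 1, 1, 0]))
```

Nothing like this happens on the Ethernet side of a streamer: the link's own signalling clock is used only to move packets into memory and has no bearing on playback timing.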

Am I missing something, given that many people hear differences with different patch cords?

retiredaudioguy

Do Ethernet cables matter, per ChatGPT: 

Short Answer: Yes — but only within reason.

Ethernet cables can make a difference in high-end audio systems, but not due to digital data loss — it’s about electrical noise.

📡 Why Digital Bits Still Matter (But Aren’t the Problem)

  • Ethernet uses packet-based transmission. If a packet is corrupted, it’s re-sent — so you still get perfect data.
  • Timing (jitter) is not carried through Ethernet like in SPDIF or AES. Your streamer/DAC reclocks the signal.
  • Therefore, sound quality differences are usually not from bit errors or timing, but from noise entering sensitive gear via Ethernet shielding or ground planes.

A very logical conclusion. 
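If anyone wants to check the "perfect data" part for themselves, a capture of the stream at the streamer's network input can simply be hashed. A quick sketch; the file names are hypothetical captures made through two different patch cords:

```python
import hashlib

def sha256_of(path: str) -> str:
    """Hash a file in chunks so a large capture doesn't need to fit in RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(1 << 20), b""):
            h.update(block)
    return h.hexdigest()

# Hypothetical captures of the same track made through a generic CAT6
# patch cord and a boutique "audiophile" Ethernet cable.
a = sha256_of("capture_generic_cat6.bin")
b = sha256_of("capture_boutique_cable.bin")
print("bit-identical" if a == b else "they differ")
```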

richardbrand

Qobuz seems to implement a sort of "running" TCP/IP which is bit-perfect for the completed packets already received, but who knows what the internet will regurgitate in the future?

It isn't clear what your claim is here. Qobuz uses TCP/IP - that's standard Internet Protocol. There's nothing unusual about it. It delivers bit-perfect data to your streamer. Whatever shortcomings you might detect with audio streaming, you can be sure the data you're getting from sources such as Qobuz and Tidal are - literally - bit perfect.

This is crazy.  Only the quality of the coax coming into your home affects the sound.  You need to try 3-4 different Internet providers and ask them which brand of coax cable they use in order to form an informed opinion. 

OK, that's a joke.  What is real is that my blog on protecting your network from lightning surges is now online.  Enjoy. 

So, doing a little research on streaming: according to Gemini, streaming services primarily use TCP, not UDP.

As I suspected, UDP is best suited to live streaming, where you would rather have the audio in real time than perfect.  Consider a video call.  If you can wait 10 seconds to hear the others in the meeting, use TCP.  If, on the other hand, you'd rather suffer the occasional packet loss and still be able to carry on a conversation in real time, use UDP. 
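A toy way to see that trade-off (all the numbers here are invented): with TCP a lost packet costs you latency while you wait for the resend, with UDP it costs you a gap in the audio.

```python
import random

PACKETS = 200
LOSS = 0.05      # 5% of packets lost on the first attempt (made-up figure)
RTT = 0.05       # seconds to detect the loss and receive a resend (made-up)

def simulate(protocol: str):
    """Return (packets missing from playback, extra delay in seconds)."""
    missing, delay = 0, 0.0
    for _ in range(PACKETS):
        if random.random() < LOSS:
            if protocol == "tcp":
                delay += RTT     # wait for the retransmission; nothing is lost
            else:
                missing += 1     # UDP: keep playing, the gap becomes a glitch
    return missing, delay

for protocol in ("tcp", "udp"):
    random.seed(7)               # same loss pattern for a fair comparison
    print(protocol, simulate(protocol))
```

Which is exactly why a video call favours UDP, while a music streamer, which can buffer seconds of audio ahead, favours TCP.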

For an audio streamer the point is moot anyway, as you still get bit-perfect transmission from your cable modem to your streamer, whether over UDP or TCP.  The de-ionized, cryogenic, thrice-blessed cable you use for the last 2 meters won't change a single bit.  Besides, barring a bad Wi-Fi environment, any packet losses are going to happen long before they reach your home.

What does matter is the quality of the buffering in the streamer, and how well the DAC's clock can pull data when it needs it without contention from the incoming stream.
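To make that concrete, here's a crude model with invented numbers: the network fills a buffer in irregular bursts, and the DAC side drains it at a rate fixed entirely by its own clock, so arrival timing never reaches the conversion stage as long as the buffer stays topped up.

```python
import random
from collections import deque

SAMPLE_RATE = 48_000                  # samples per second (made-up rate)
BUFFER_TARGET = SAMPLE_RATE           # about one second of buffered audio
PER_MS = SAMPLE_RATE // 1000          # the DAC clock asks for 48 samples/ms

buffer = deque([0] * BUFFER_TARGET)   # streamers pre-fill before playback starts

underruns = 0
for millisecond in range(10_000):
    # Network side: packets arrive in irregular bursts, and the streamer
    # only asks for more when the buffer has room (crude flow control).
    if len(buffer) < BUFFER_TARGET:
        buffer.extend([0] * random.randint(0, 4 * PER_MS))
    # DAC side: exactly PER_MS samples are pulled every millisecond,
    # paced by the DAC's own clock, regardless of when packets arrived.
    if len(buffer) >= PER_MS:
        for _ in range(PER_MS):
            buffer.popleft()
    else:
        underruns += 1                # buffer ran dry: an audible dropout

print("underruns:", underruns)        # a well-sized buffer should print 0
```

The deque stands in for the streamer's RAM buffer; the deeper it is and the cleaner the refills, the less the input side can intrude on the conversion clock.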