ETHERNET CABLES


When using Ethernet for hooking up streaming devices and DACs, what Cat level of Ethernet cable should be used? Is there any sonic improvement from going to a higher-dollar Cat 7 or Cat 8 cable?

samgar2

@fredrik222 ,

Just as I thought: a lazy authoritarian. The evidence is out there for anyone to see yet you won’t take a look. I’ve been around long enough to know that anything I post or link to will be rejected, outright.

I’ve done it enough times here that I won’t do it anymore. You’ll just flat out reject it due to your confirmation bias. Check out the search engine, above, and take a gander but, as I already said, you’re a lazy authoritarian.

Tell me freddy, are you satisfied with your cable TV picture? It comes to you in approximately the same way as your music. Have you compared it to, say, the Blu-ray equivalent? Is not the Blu-ray much, much better? (I hope and pray you have, or this is one big meaningless waste of time)

Now, do you think that the signal you’re getting is as pristine as from a CDP? Do you think all the extra boxes and cables don’t add something to the mix? You really should look into some kind of deprogramming course.

All the best,
Nonoise

@cleeds 

Never said so. What I did say was that there is no theoretical improvement from going crazy and using fiber optic and other exotic things for a 10 ft run in a residential application. Simply put, if you hear an improvement in that setting, you are making yourself hear it subconsciously.

It's different if you were to run 300 ft of Ethernet through a factory, as an example. But if you do that and plug it into your streamer, you have other issues, with audible noise drowning out your music.

If you want to learn about how cables actually work in Ethernet:


https://www.cablinginstall.com/home/article/16467568/the-myths-and-realities-of-shielded-screened-cabling


Everyone keeps focusing on the cable and the noise that may couple into the signal on the cable, but the key here is what happens after the signal leaves the cable and enters the Ethernet receiver. In a 3- or even 10-foot run, a tiny amount of noise can couple into the signal. However, in the receiver, the received signal is compared to reference or threshold values, and based on this comparison an entirely new signal is created. The received signal (and any noise) is effectively discarded. The newly created signal is output at the correct voltage level, without the noise. This new signal could be transmitted for another 300 feet (or some other distance based on data rate) or passed on to the internals of the streamer for further decoding. At each hop in the communication path, the signal is re-created and retimed, creating an entirely new, clean signal.

While analogies are often imperfect (and I am told mine are often terrible), think of a blank 8.5x11 inch piece of paper sent through the mail. The post office may crumple it or bend it such that when it arrives at your house, it is crumpled (noisy). However, the person receiving it can tell that it is a blank 8.5x11 inch piece of paper (albeit crumpled), so they get a brand-new sheet out of the drawer and toss the crumpled one in the trash. The received paper was only used as a reference to know what size of new paper to pull from the drawer. Same with an Ethernet receiver: it compares the voltage value of the received signal to a threshold value to create an entirely new signal (without noise), and the received signal (with noise) is discarded. This is unique to the digital domain and does not occur in the analog domain.

In a PAM4 system, typical of Ethernet, the signal is transmitted at one of four voltage levels, such as 0, 1, 2, or 3 volts. The received signal will vary somewhat due to noise and the effects of the channel. For example, a signal originally at 2 volts that is transmitted over a long cable run could be received at 2.2 volts or even 1.7 volts. It would be compared to the four voltage levels, and an entirely new, clean signal would be output at 2 volts, the closest voltage value. The original signal is effectively discarded. This is the benefit of digital communication over analog communication over long distances (and short ones).
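
To make that concrete, here's a rough sketch in Python of the "compare to the nearest level" step. The 0/1/2/3-volt levels are just the illustrative numbers from this example; a real transceiver uses different signaling and also does equalization, clock recovery, and error checking.

```python
# Illustrative sketch only: quantize a noisy received voltage back to the
# nearest nominal level, producing a brand-new clean symbol. The levels
# match the 0/1/2/3 V example above, not any real PAM4 spec.
NOMINAL_LEVELS = [0.0, 1.0, 2.0, 3.0]

def regenerate(received_volts: float) -> float:
    """Return the nominal level closest to the received voltage.
    The noisy input is discarded; only the clean level is passed on."""
    return min(NOMINAL_LEVELS, key=lambda level: abs(level - received_volts))

print(regenerate(1.7))  # -> 2.0 (a 2 V symbol received low still decodes cleanly)
print(regenerate(2.2))  # -> 2.0 (a 2 V symbol received high decodes the same way)
```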

Based on this method of operation, the tiny improvement a better cable provides will not yield a different outcome when the signal is re-created. A signal transmitted at 2 volts might be received with a shitty cable at 2.1 volts and with an amazing cable at 2.08 volts (a small improvement). Both will be compared to the threshold values (0 volts, 1 volt, 2 volts, 3 volts), and the receiver will output a clean 2-volt signal.
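
Plugging those hypothetical cable numbers into the same nearest-level comparison, both cases regenerate the identical clean symbol:

```python
# Hypothetical values from the paragraph above: the same 2 V symbol,
# received at 2.1 V over a cheap cable and at 2.08 V over an expensive one.
levels = [0.0, 1.0, 2.0, 3.0]
for received in (2.1, 2.08):
    clean = min(levels, key=lambda level: abs(level - received))
    print(f"received {received} V -> regenerated {clean} V")  # both give 2.0 V
```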

Peace.


@nonoise  

My post was deleted when I asked you to put up any evidence for your point. Regardless, there is no evidence for this. I've posted several links showing why noise in Ethernet is a non-issue for residential applications.


And your analogy about streaming TV vs. Blu-ray really shows your lack of understanding of anything relevant at all. The primary difference is bitrate for video and multichannel audio, and it has nothing to do with any type of noise. Cable TV also uses a lower bitrate and is typically compressed with lossy compression.


Here's one link that explains it to you:


@12many @fredrik222 I’d love to believe you guys that Ethernet cables make no difference, and I totally get where you’re coming from about the physics saying they should make no difference. But I’ve heard enough people here say they not only make a difference, but a pretty sizable difference. In fact, everything in streaming seems to make a big difference.

I started streaming from my iPad through an upgraded Lightning-to-USB cable, which was a big improvement over the Apple Camera Adapter, but when I added an iFi Zen Stream things improved exponentially and actually surpassed spinning CDs. So I bought a mid-priced Wireworld Starlight 8 Ethernet cable I’m gonna try out between my router and streamer to see if it makes a difference versus a garden-variety Ethernet cable. If I hear no difference, I’ll just return it. But at least I tried, accepted that MAYBE there’s something I didn’t know, and trusted my own ears.

You know, CDs were “perfect sound forever” until they finally figured out how to measure the jitter, timing, and noise that turned out to be BIG problems, and why many audiophiles just stuck with vinyl until manufacturers figured it out. I might hear no difference between the two Ethernet cables, but if I do hear a difference, maybe sometime in the future they’ll be able to measure it and explain why. And if not, I’ll just return the Wireworld cable and no biggie.

I also just upgraded from a very decent Apogee WydeEye professional digital cable, which I’ve loved and stuck with for 10 years, to an Acoustic Zen MC2 that is notably better in absolutely every parameter; it’s not even close in a direct A/B comparison. At some point in the past someone would’ve said there couldn’t be any difference because it’s just ones and zeroes (and yes, they’re both 75-ohm cables), but here we are.

I guess what I’m saying is, rather than just blindly walling it off, why not just try a “better” Ethernet cable, and if you don’t hear a difference, return it? What’s the harm? Worst case is you’ll just prove your point to yourselves, and if not, you’ve benefitted from better sound quality and learned something. That’s a win-win either way in my book.