Some companies actually do measure their cables, and the measurements and graphs are on their websites. What I have found is that it doesn't matter how the cables measure if the components (most of them) are unable to send and recover a perfect signal. It's trial and error with different cables until you find the one that, by 'happy accident', seems to counteract the errors and lets your system sing. The perfect cable could be one buried in your junk box, an expensive boutique cable, or anything in between.
Williewonka, the first layers of Ethernet don't have a checksum, and you'd still need a shitty cable to fail at the physical or data link layer. Statistics on my server (current uptime 27 days):
Iface MTU RX-OK RX-ERR RX-DRP RX-OVR TX-OK TX-ERR TX-DRP TX-OVR Flg
It's probably a bit messy to read here, but you can see there are no TX or RX errors at the interface level (eth0 is the physical interface; ignore the 'lo' interface, that's the loopback, aka 127.0.0.1). Anyway, no failures on 200+ million packets (RX and TX combined) on a 1000BASE-T link connected with a $5 Cat5e cable (length 5 m). PS: USB has a hardware checksum test, but USB audio/video devices use isochronous communication to avoid latency (thus no hardware checksum for that).
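For anyone who wants to check their own server or streamer the same way, here is a minimal Python sketch, assuming a Linux host and an interface actually called eth0 (the name is just an example), that reads the same packet, error and drop counters from sysfs that the interface statistics above summarize:

```python
#!/usr/bin/env python3
"""Read per-interface packet/error counters from Linux sysfs.

Sketch only: assumes a Linux box; replace "eth0" with your interface name.
Counters live under /sys/class/net/<iface>/statistics/.
"""
from pathlib import Path


def read_counters(iface: str = "eth0") -> dict:
    stats_dir = Path("/sys/class/net") / iface / "statistics"
    names = ("rx_packets", "tx_packets", "rx_errors", "tx_errors",
             "rx_dropped", "tx_dropped")
    # Each counter is a plain text file containing a single integer.
    return {name: int((stats_dir / name).read_text()) for name in names}


if __name__ == "__main__":
    c = read_counters("eth0")
    packets = c["rx_packets"] + c["tx_packets"]
    errors = c["rx_errors"] + c["tx_errors"]
    dropped = c["rx_dropped"] + c["tx_dropped"]
    print(f"{packets} packets, {errors} errors, {dropped} dropped")
```

If the error and drop counts stay at zero over hundreds of millions of packets, the cable is delivering the bits intact at the link level, which is the point being made above.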
@danip4 - I'm far from an expert on this topic, but this thread appears to show that checksums are employed in Ethernet: https://networkengineering.stackexchange.com/questions/37492/how-does-the-tcp-ip-stack-handle-udp-ch... What I can confirm is that using Ethernet conveys a significantly better audio result than any of my asynchronous-mode transfers, i.e. USB, SPDIF or optical. Each of those required the best cable possible to even approach the level of sound quality that Ethernet provides. So I think my point still stands: the quality of sound from a digital source that uses asynchronous transfer protocols can be impacted by the quality of the cable used. Regards
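For what it's worth, the checksums that Stack Exchange thread discusses are the TCP/UDP/IP ones computed by the network stack, separate from the Ethernet frame's CRC-32 FCS done in hardware. As a purely illustrative sketch, this is the RFC 1071 ones' complement checksum that UDP and TCP use over their headers and data; the payload bytes here are made-up example data, not anything from a real audio stream:

```python
def internet_checksum(data: bytes) -> int:
    """RFC 1071 ones' complement sum over 16-bit words, as used for
    UDP/TCP/IP checksums (not the Ethernet CRC-32 FCS)."""
    if len(data) % 2:
        data += b"\x00"                      # pad odd-length data with a zero byte
    total = 0
    for i in range(0, len(data), 2):
        total += (data[i] << 8) | data[i + 1]
        total = (total & 0xFFFF) + (total >> 16)   # fold the carry back in
    return (~total) & 0xFFFF


# A single corrupted byte changes the checksum, so the receiver can detect the
# damaged datagram (TCP retransmits it; plain UDP just drops it).
payload = b"audio frame bytes"
assert internet_checksum(payload) != internet_checksum(b"Audio frame bytes")
```

The practical upshot for the argument above: over TCP a corrupted packet is detected and resent, so the bits that reach the application are the bits that were sent, regardless of the cable, whereas isochronous USB audio has no such retry.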