The reason some USB cables are better comes down to cleaner edge detection on what is really an analogue square wave. Of course it's not perfectly square, just as a sine wave from a power station isn't a perfect sinusoid.
For my qualifications: I'm an instrumentation technician and automation programmer. Please go and do some actual study and research before claiming there is no difference between different conductors, cable materials, geometry and shielding. Get on a decent oscilloscope and see for yourself. I've worked in a business that controls robots using EtherCAT, and I've spent hours in a classroom studying the theory to earn those qualifications.
For example, the Cat 5, 5e, 6, 6a, 7 and 8 Ethernet cable standards.
These aren't for fashion; there are actual, measurable differences in performance.
Try a quality USB cable (I used to work for a reasonably well known USB cable maker in Australia, as an employee) with a quality USB DAC sometime and hear the difference.
Digital is an encoding system; the actual transfer of data is entirely analogue and electrical. Edge detection is how the receiver decides, over time, when the electrical signal has transitioned between its powered and unpowered states, and noise is a real issue there.
There’s nothing in the cables passing little ones and zeros across the cable.
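To make the point concrete, here's a minimal sketch of the idea: bits are driven onto the line as voltage levels, the receiver samples the line against a threshold, and noise on that analogue voltage can flip the decision. All the numbers here (levels, threshold, noise amplitude) are illustrative assumptions, not values from any USB specification.

```python
import random

def recover_bits(bits, noise_amplitude, threshold=0.5, seed=0):
    """Naive threshold detection on a noisy 'square' wave.

    Each bit is transmitted as an ideal voltage level (0.0 or 1.0);
    the receiver samples once per bit and compares against a fixed
    threshold. Line noise is modelled as a uniform random voltage
    added to each sample. Purely illustrative, not a real USB model.
    """
    rng = random.Random(seed)
    recovered = []
    for bit in bits:
        voltage = float(bit)                                       # ideal level
        voltage += rng.uniform(-noise_amplitude, noise_amplitude)  # line noise
        recovered.append(1 if voltage > threshold else 0)
    return recovered

if __name__ == "__main__":
    tx = [random.Random(1).randint(0, 1) for _ in range(10_000)]
    for noise in (0.2, 0.5, 0.8):
        rx = recover_bits(tx, noise)
        errors = sum(a != b for a, b in zip(tx, rx))
        print(f"noise \u00b1{noise}: {errors} bit errors out of {len(tx)}")
```

With small noise every sample still lands on the right side of the threshold, so the link looks "perfectly digital"; push the noise amplitude past half the signal swing and bit errors appear, which is exactly why the analogue quality of the link matters.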
EDIT: excuse my rant. I'm just trying to make sure people don't get misinformed and miss out on relatively cheap solutions that can significantly improve performance.