Network Switches


david_ten
@jason_k2017,
I can grasp lots of things, thank you; I do so by keeping an open mind.
Even at my ripe old age of 65, my eyes can still see the difference between 1080p and 4K TV on a decent setup (that was one bad analogy, by the way).

All the best,
Nonoise
+1, @nonoise. 
I see that @jnorris2005 is still trying to convince us with the same old analogy of the signal passing through hundreds of routers, repeaters, data centers... blah blah.

It's the same signal / data sent to each subscriber, yet our experiences differ. Why? Because of how we choose to decode that incoming signal in the comfort of our homes. What these naysayers fail to understand is this: the same identical signal or data stream sent to his home or mine results in a different aural experience based on our choice of electronics. In jnorris's whimsical world, a $5K streamer/DAC shouldn't sound any different than a $40 Google Chromecast streaming device, because we are dealing with the same digital signal, basically 1's and 0's.

In a digital chain everything matters.

Let's take the 4K analogy: why is brand X's TV able to display the identical 4K stream better than brand Y's? It's all in the implementation and how we choose to decode that incoming audio or video signal.
You are attempting to communicate a level of technical astuteness with your post, but you appear to have missed the point completely.
Almost no one in this thread is questioning that the bits are communicated perfectly, minus the standard data errors that do occur, though almost never within the confines of that last data transfer. With few exceptions in this thread, which we can discount, no one has suggested that the signal or data degrades. That you are using that argument suggests you have not read this thread, or do not understand its contents.
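To make that concrete, here is a minimal sketch (Python; the file names are hypothetical) of how anyone can check bit-perfect delivery for themselves: hash the same payload as stored at the source and as received over the network, however many switches and hops it crossed, and the digests will match.

import hashlib

def sha256_of(path, chunk_size=1 << 20):
    # Read the file in chunks and return its SHA-256 digest.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# "local_copy.flac" and "fetched_over_network.flac" are hypothetical names:
# the same track as stored at the source and as received at your end.
# If the network delivered the bits intact, the digests are identical.
print(sha256_of("local_copy.flac") == sha256_of("fetched_over_network.flac"))

Which is why the interesting question in this thread is not whether the bits change, but what the switch injects electrically into the downstream gear.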

What has been communicated, specifically by Almarg, is that the Ethernet connection, while transformer-isolated, is still an entry point for EMI, both magnetically and capacitively.




jnorris2005 | 75 posts | 11-04-2019 12:01am

The fact is that this signal has passed through hundreds of routers, repeaters, data centers, and switches prior to arriving at your router. Are we to understand that all those networking devices have had no effect on the signal, thus allowing that signal to arrive at your digital doorstep in pristine condition? Are we to further understand that the only place deterioration of the signal can occur is within the final switch, and hence that switch needs to be a magical audiophile switch?

Your whole argument sounds like the same pseudo-scientific verbiage used to describe other incredibly overpriced nonsense products that plague high-end audio.

Actually, 24 frames per second was a compromise: not perfect, but good enough in the early days when film was expensive. As "film" dies and costs decrease, we will move towards higher frame rates to improve the perception of motion. You know that each "film" frame was flashed 3 times, giving 72 flashes per second, which again was not ideal but an acceptable compromise.
Being able to see differences between 1080p and 4K is more a factor of corrected vision and viewing distance than of age, at least up to a fairly advanced age, unless there is specific macular degeneration.

Knowing everything about codecs, digitization, and the limits of human hearing will not tell you anything about a specific DAC implementation and its susceptibility to EMI.

jason_k2017 | 6 posts | 11-04-2019 12:56am
@nonoise


@jason_k2017, are you suggesting that what you're hearing is exactly as it was encoded? That your DAC is perfect?
The one thing I do know is that if you decode an encoded stream in a quality DAC with the same codec with which it was encoded, then it will be as near to perfect as anything else I can throw at my sound system. My turntable (which is no longer used) is not perfect, my CD player is not perfect, my DB receiver is not perfect. Nor are yours. None are.

You should read up a little on codecs, digitization, and the capabilities and limitations of the human ear. There is a point at which nobody can detect any changes to audio. If you can't grasp that, think of your eyes instead, and why film needs only to be at 24 frames per second, and why you have to be reasonably young to see any difference between 1080p and 4K TV.
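For reference, a back-of-the-envelope sketch (Python; assuming standard CD parameters of 44.1 kHz / 16-bit) of the numbers behind that claim: the Nyquist limit sits above the commonly cited ~20 kHz ceiling of human hearing, and 16-bit quantization allows roughly 96 dB of dynamic range.

import math

SAMPLE_RATE_HZ = 44_100    # CD sample rate
BIT_DEPTH = 16             # CD bit depth
HEARING_LIMIT_HZ = 20_000  # commonly cited upper limit of human hearing

nyquist_hz = SAMPLE_RATE_HZ / 2                     # highest frequency the format can represent
dynamic_range_db = 20 * math.log10(2 ** BIT_DEPTH)  # quantization dynamic range

print(f"Nyquist limit: {nyquist_hz:.0f} Hz vs hearing limit ~{HEARING_LIMIT_HZ} Hz")
print(f"16-bit dynamic range: {dynamic_range_db:.1f} dB")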

@atdavid

With few exceptions in this thread, which we can discount, no one has suggested that the signal or data degrades.
This has thrown me.
Yes, many of you have. It is the whole point of this thread and of my original question.

Several people have stated that a 'special' switch will prevent it from doing so. Or are you now saying that a special switch is totally unnecessary, as the signal or data does not degrade? Please make up your mind.