Celo,
I've found that the Aries Mini does seem to sound better: more detailed and more extended. My guess is that it's due to advances in network chips and clocking, resulting in fewer errors and better data throughput. The SBT was a cool product that Logitech decided to abandon, so it was never updated any further. Still perfectly viable, but I think newer technology like the Aries Mini is better. The only constant in the digital world is change, and that change happens much faster than we usually like or expect!
dlcockrum,
Let's be clear: I am not discounting the fact that different cables can sound different; what I'm trying to discern is why that would be the case. Keep in mind that the operative word is "IF".... IF the data structure is exact at both ends of a cable, IF the cable allows said data to clock in at the proper time, and IF the cable ensures that data voltage levels are consistent and correct, then no other cable can improve on the audible results. If the data is correct, it's correct. Differences can be had past the receiver/transmitter chip in whatever device you use, like in a digital filter, DAC, oversampling system, etc., but a digital cable can't change the sonics UNLESS it's changing the content of the data in some way. That said, it's very possible that this is happening, but it's interesting that there are so many different manufacturers with different USB cables at different price points; how are they manipulating that data stream to get different results? That's my question. And, how do we know when a cable does get the data right versus when it doesn't? Are we really hearing "improvement" or just something "different"?
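To make the "if the data is correct, it's correct" point concrete, here's a minimal sketch of how one could check it: capture the same track at the receiving end through two different cables and hash the results. The file names (capture_cable_a.wav, capture_cable_b.wav) are hypothetical stand-ins for whatever your capture tool produces; this assumes you have a way to capture the incoming stream at all. If the digests match, the two cables delivered bit-identical data, and any audible difference has to come from something other than the bits themselves.

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical file names: the same track captured at the receiving end
# through two different USB cables.
digest_a = sha256_of("capture_cable_a.wav")
digest_b = sha256_of("capture_cable_b.wav")

print("bit-identical" if digest_a == digest_b else "the data differs")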
By the way, this question should be the same with any digital transmission standard, be it USB, CAT5, CAT6, S/PDIF, whatever. A digital signal being transmitted must ultimately be converted back to an analog output at some point when we're talking about audio. Each of those has an industry-accepted standard that must be adhered to. Now, are the standards themselves not up to the task of carrying the data they are transmitting, or are we playing with bit streams within cables to "filter" for a particular sound profile? In practice, no digital transmission is without error; that's why there's error correction all over it. But, in the end, how do we ensure that a digital data stream is as bit-perfect as it can be at the input of a DAC? That's what matters.
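As a toy illustration of the error-detection side of this (not any particular standard's implementation), here's a sketch of how a checksum catches a single flipped bit in a block of data. Real links do something analogous at the frame level, though the details and what happens after a detected error (retransmit, conceal, or just pass it along) differ from transport to transport.

```python
import zlib

# Stand-in for a chunk of PCM audio data in a transport frame.
payload = bytes(range(256)) * 16

# Sender computes a CRC-32 over the payload and sends it alongside the data.
crc_sent = zlib.crc32(payload)

# Simulate a single bit getting flipped somewhere along the cable.
corrupted = bytearray(payload)
corrupted[1000] ^= 0x01

# Receiver recomputes the CRC over what actually arrived and compares.
crc_received = zlib.crc32(bytes(corrupted))
if crc_received == crc_sent:
    print("checksum matches -- data accepted as-is")
else:
    print("bit error detected -- the receiver knows this frame is bad")
```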