Bob,
Some people argue that you get a better-maintained signal when you use a better cable. Some also argue that bits aren't bits, and that signal degradation will affect the sound even if the bits are still recognizable as ones and zeros. My point is, it doesn't matter even if you believe those arguments. If the cable follows the HDMI standard (which it doesn't have to be expensive to do), it will get the signal there unless the cable is broken. The signal is regenerated at the receiving end by the processor chip's cache and/or memory buffers, so the attenuation issue some people consider a problem is a non-issue as long as the cable meets the spec (sizing/gauge/etc.).
If the bits aren't getting to the receiver, then the cable is broken or isn't adhering to the standard... Paying more for a cable won't prevent you from ending up with a defective one. If you have a broken cable, return it.
Perhaps there isn't any implicit error correction in HDMI itself. I do know that the signal isn't processed at all when the cable is malfunctioning with certain media, yet with other media some kind of error detection/correction must be occurring. I've had one cable malfunction and then break on me due to mishandling - I tried to snake it through the wall one too many times. Ironically enough, it was a relatively expensive cable (over $60). Signal passing worked on that cable with some material and not with other material.
I know that with CDs there are multiple error checks when the disc is read. And if you are sending data via bitstream to a preprocessor to have it decode certain compression algorithms, I believe the decoding process error-checks the stream as well when converting it to PCM. Am I 100% certain of which algorithms perform error correction and which don't? Not off the top of my head. There are kinds of error correction that don't require resending information (redundancy checks): the redundancy is usually built into the stream in the form of a checksum, or the stream itself is framed in a way that provides one. These checks take place in the decoding software.
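As a toy illustration of the kind of in-stream redundancy check I mean (not the actual scheme a CD or any particular codec uses - CDs use Reed-Solomon codes, which can also *correct* errors, not just detect them), here's a sketch of framing a payload with a checksum so the decoder can catch corruption without asking for a resend:

```python
import zlib

def frame(payload: bytes) -> bytes:
    # Prepend a CRC32 checksum so the decoder can detect corruption
    # without any retransmission from the sender.
    crc = zlib.crc32(payload)
    return crc.to_bytes(4, "big") + payload

def decode(framed: bytes) -> bytes:
    # Split off the stored checksum and recompute it over the payload.
    stored = int.from_bytes(framed[:4], "big")
    payload = framed[4:]
    if zlib.crc32(payload) != stored:
        raise ValueError("checksum mismatch - stream corrupted")
    return payload

good = frame(b"PCM audio block")
assert decode(good) == b"PCM audio block"

bad = bytearray(good)
bad[7] ^= 0x01  # flip one bit in the payload
try:
    decode(bytes(bad))
except ValueError:
    print("corruption detected")  # prints "corruption detected"
```

A plain checksum like this only detects errors; real forward-error-correction codes add enough redundancy to repair small errors in place, which is why a lightly scratched CD still plays.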
CD-ROMs have multiple redundancy checks, and so do Internet protocols (at more than one layer of the TCP/IP stack, as well as between larger ISP trunks using proprietary signaling frames).
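For a concrete example of one of those TCP/IP-layer checks: the same ones'-complement checksum (RFC 1071) appears in the IPv4 header and again in the TCP and UDP headers, i.e. two different layers each verify their own slice of the packet. A minimal sketch:

```python
def internet_checksum(data: bytes) -> int:
    # RFC 1071 ones'-complement checksum, as used in the IPv4
    # header and again in the TCP and UDP headers.
    if len(data) % 2:
        data += b"\x00"  # pad to an even number of bytes
    total = 0
    for i in range(0, len(data), 2):
        total += (data[i] << 8) | data[i + 1]
        total = (total & 0xFFFF) + (total >> 16)  # fold carry back in
    return ~total & 0xFFFF

# The sender stores this value in the header; the receiver recomputes
# the sum over the data *including* the stored checksum, and a clean
# packet comes out to zero.
payload = b"\x12\x34\x56\x78"
csum = internet_checksum(payload)
assert internet_checksum(payload + csum.to_bytes(2, "big")) == 0
```

Any single flipped bit changes the recomputed sum, so the receiving layer simply drops the packet and (in TCP's case) the data gets retransmitted - a detect-and-resend strategy, as opposed to the forward correction a CD uses.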
I used bad terminology by referring loosely to the underlying digital signal and HDMI as one and the same...
If you are looking for more detail than that, you're going to have to research it yourself. I'm taking four engineering classes ATM as prerequisites for an MS program in an engineering field I want to pursue (a test Monday in Calculus 3, a test last night in Engineering Physics, and a test this morning in Statics...). That leaves me little time for leisure, and I don't feel like spending too much of it researching topics in which I have only a passing interest. I would be interested in what you find out, though. :D