Does an HDMI cable make a difference in picture quality??


I just bought a 42-inch LG LCD flat-screen TV with full HD. I have an HD cable box and opted to use RCA cables between the box and the TV. I am getting mixed opinions from the cable provider and the LG dealer about the usefulness of an HDMI hookup. Does it actually improve picture quality??? If so, how much? And will it also improve non-HD programming??? I have seen several ads from Monster Cable and Audioquest touting their HDMI cables. Monster has one that is about $70; another is about $100. I have seen even higher price tags from Audioquest. All comments welcomed. Thanks, Jim
sunnyjim
The only way to really test two HDMI cables and see which cable is better is to freeze-frame the HD picture and then compare the two cables looking at the same still picture. Then you can clearly see fine details missing in that still picture, and faded-looking colors too, when you compare Wegrzyn's pure solid-silver HDMI cable to other HDMI cables on the market.....
Freeze-framing would cause the digital source to output exactly the same data for every frame. So if there were distortion caused by a cable not up to specification, random noise would be introduced in the form of digital artifacts. They would be obvious, and they would not look as if a digital filter had been applied to the picture to make it look faded or to soften details. Also, even though the source data is constant, the artifacts would not hold still: the data stream repeats, but any noise is injected at random.
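A rough way to picture that point (toy frame size and an assumed bit-error rate, purely for illustration, not numbers from any spec): if you send the same frozen frame twice over a marginal link, the damaged pixels land in different random spots each time, nothing like a uniform fade or softening.

```python
import random

WIDTH, HEIGHT = 16, 8          # toy "frame", far smaller than real video
ERROR_RATE = 0.02              # assumed error rate for illustration only

def corrupt(frame, rate):
    """Flip random pixels to simulate bit errors on a marginal digital link."""
    damaged = []
    for y in range(HEIGHT):
        row = []
        for x in range(WIDTH):
            if random.random() < rate:
                row.append("X")          # visibly broken pixel (artifact)
            else:
                row.append(frame[y][x])  # pixel arrives intact
        damaged.append("".join(row))
    return damaged

still_frame = ["." * WIDTH for _ in range(HEIGHT)]   # identical data every refresh

# Send the same frozen frame twice over the "bad cable":
# the artifacts land in different random places each time,
# instead of dimming or softening the whole picture uniformly.
for attempt in (1, 2):
    print(f"attempt {attempt}:")
    print("\n".join(corrupt(still_frame, ERROR_RATE)))
    print()
```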

Also, HDMI data is checksummed; see the HDMI specification:

www.hdmi.org/pdf/HDMISpecInformationalVersion.pdf
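For what checksumming buys you in general, here is a toy example (a simple additive checksum, not the actual coding scheme in the spec linked above): the receiver can tell that a block of data arrived damaged, rather than quietly rendering a slightly "faded" version of it.

```python
def checksum(data: bytes) -> int:
    """Toy additive checksum -- illustrative only, not HDMI's actual coding."""
    return sum(data) % 256

packet = bytes([10, 200, 55, 91])        # pretend video payload
sent = (packet, checksum(packet))

# Simulate one byte corrupted in transit.
corrupted = bytes([10, 201, 55, 91])
received = (corrupted, sent[1])

ok = checksum(received[0]) == received[1]
print("payload accepted" if ok else "corruption detected")   # -> corruption detected
```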
Toufu, if that were true, then the Wegrzyn HDMI cable would look the same too... faded and with missing details... ???
Hifisoundguy, digital artifacts would not be likely to show up as fading or missing details.

For analog transmission, yes: the output is a direct product of the signal being transmitted, so if the cable is suspect, the resulting picture/sound can appear as if a filter had been applied, making the picture look distorted, faded, etc.

But for digital, the analog is first encoded to digital, the digital bits are then converted to an electrical signal to travel over the cable, and at the other end that signal is picked up, decoded back to digital, and finally converted to analog again. So any noise introduced in transmission would not appear uniform, such as a fade across the whole picture.
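Here is a crude way to see why an error in the digital domain does not look like a gentle fade (the 8-bit sample value is picked just for illustration): flipping a single bit of a sample changes it by a power of two, a big discrete jump in one spot, not a slight dimming of everything.

```python
def flip_bit(value: int, bit: int) -> int:
    """Flip one bit of an 8-bit sample, a crude stand-in for a link error."""
    return value ^ (1 << bit)

sample = 180                      # an 8-bit brightness value, for illustration
for bit in range(8):
    print(f"bit {bit} flipped: {sample} -> {flip_bit(sample, bit)}")

# Errors land as jumps of 1, 2, 4, ... 128 in individual samples --
# isolated, glaring artifacts -- not a uniform reduction across the picture.
```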

Think back to the days of analog TV: if you didn't have a clear signal, your picture looked distorted, but at least in a recognizable way. But if you use an HD antenna to pick up a digital signal, you either get a picture or you don't. Or, if the tuner locks onto a weak signal, you can see the digital artifacts very clearly (blocks of green or bad pixels).