Digital cable v. standard interconnects

This may sound somewhat naive, but what is the difference between a "digital coax cable" and standard RCA-type interconnects? Is it truly necessary to connect a digital source with a digital cable, or will any decent IC work as well?
With my $3k system (all used — about a $6k system new) I could not hear any difference between a Canare digital cable and a good-quality home-brew RCA interconnect. I couldn't even hear a difference when I swapped in one of the interconnects that came with my Marantz CD player.

So I happen to think that in an inexpensive system like mine, digital cables don't matter. Save your money for the analog interconnects — that's where the difference is.
I'm no techie, but a true digital cable is supposed to have a uniform impedance of 75 ohms from tip to tip. Few, if any, cables actually achieve this spec, which probably accounts for the wide variation in sound and in opinions as to whether it matters. Some analog interconnects, when used as a digital cable, may brighten or darken the sound and can negatively affect the soundstage. If you want to play it safe and minimize your investment, you can get a halfway decent Canare digital cable for around $20 on eBay.
No digital cable with RCA termination will exhibit a true 75 ohm (nominal) impedance. RCA connectors were designed for audio rather than radio frequency (RF) use, so they do not maintain a constant 75 ohm impedance at the RF frequencies carried by the digital signal.
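For anyone curious why the 75-ohm figure matters: when a cable's impedance doesn't match the 75-ohm source and load, part of each signal edge is reflected back down the line, which can smear the transitions the receiver clocks on. The sketch below uses the standard transmission-line reflection formula; the specific impedance values are just illustrative examples, not measurements of any particular cable.

```python
# Illustrative sketch: voltage reflection at an impedance mismatch.
# Standard formula: Gamma = (Z_line - Z_ref) / (Z_line + Z_ref)

def reflection_coefficient(z_line: float, z_ref: float = 75.0) -> float:
    """Fraction of the signal voltage reflected where a cable of
    impedance z_line meets a 75-ohm reference (S/PDIF nominal)."""
    return (z_line - z_ref) / (z_line + z_ref)

# A true 75-ohm cable reflects nothing:
print(reflection_coefficient(75.0))   # 0.0
# A hypothetical ~50-ohm analog interconnect reflects 20% of each edge:
print(reflection_coefficient(50.0))   # -0.2
```

Whether that 20% reflection is audible through a given DAC's clock recovery is exactly what this thread is arguing about — the math only says a mismatch exists, not that you'll hear it.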

Ideally the digital cable, processor, and transport would all use BNC connectors, which do maintain a constant impedance. Some manufacturers (e.g., Wadia) supply their gear with BNC connectors in place of the RCA.