Matching digital input impedance of dac with cable


Is this important? The input impedance of the DAC I am considering is 75 ohms, and I want to use a cable with a characteristic impedance of 110 ohms. Is this a problem? Can someone please explain why matching (or not matching) these impedances matters?

Thank you
tboooe
It's about reflections within the "transmission line" of the cable. These reflections can occur at any junction in the connection (connectors, solder joints, etc.), but in this case we're concerned with the junctions we can control: the connector junctions. The reflected energy can then arrive at varying times during the transmission of the data, possibly (and that's a point of some debate) interfering with the receiving device's ability to recognize the exact point at which the waveform transitions between its 0 and 1 levels.
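To put a rough number on it: the standard lossless transmission-line model says the fraction of an incident wave reflected at a junction is Γ = (Z_load − Z_cable)/(Z_load + Z_cable). A quick sketch (this is a back-of-the-envelope illustration for the 110-ohm-cable-into-75-ohm-input case being discussed, not a claim about how audible it is):

```python
import math

def reflection_coefficient(z_load: float, z_cable: float) -> float:
    """Fraction of incident wave amplitude reflected at an impedance junction
    (lossless transmission-line model)."""
    return (z_load - z_cable) / (z_load + z_cable)

# 110-ohm AES/EBU-style cable driving a 75-ohm S/PDIF input
gamma = reflection_coefficient(75.0, 110.0)
return_loss_db = -20.0 * math.log10(abs(gamma))

print(f"reflection coefficient: {gamma:.3f}")   # roughly -0.19, i.e. ~19% of the edge reflects
print(f"return loss: {return_loss_db:.1f} dB")
```

So roughly a fifth of each signal transition bounces back down the cable at that junction, which is the mechanism behind the timing-uncertainty concern described above.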
How are you planning to use a cable designed for 110 ohm use for a 75 ohm connection? Adapters? If so, my personal experience says that's a bad idea sonically.
Just my two cents.
Thanks tplavas...that was my question: do I need to match the impedance of the DAC and the cable? From what I understand of your response, I do need to match the impedances...
Since we're on the subject, I've seen DACs with 75 ohm inputs and others with 110 ohm inputs. Is there a performance difference between the two, other than the 110 ohm input being easier for a transport to drive?