Signal degradation over length, source vs. output?


Which signal is more likely to degrade over length, source to amp, or amp to speaker? I have a pair of 300B monos on the way and don't know where to place them relative to preamp and speakers. Is it generally better to place the amps close to the pre-amp and use a short interconnect and a longer speaker cable or the other way around? Any insight would be appreciated.
jamesddurkin
The lengths you describe are not a real concern either way, but I still prefer shorter speaker cables.

The reason for my preference is that the connection between the preamp and amp is impedance-defined for the transfer of voltage. Assuming decent cable and appropriate impedances at the preamp output and power amp input, the only loss will be a small amount of voltage, which is easily compensated for with a slight touch on the volume control.
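To put a rough number on that, here is a minimal voltage-divider sketch. The impedance and resistance values are illustrative assumptions, not measurements of any particular gear:

```python
import math

# Voltage-divider model of a preamp-to-amp interconnect.
# All values below are illustrative assumptions, not measured data.
def interconnect_loss_db(source_z=100.0, cable_r=0.5, load_z=47_000.0):
    """Attenuation in dB of a voltage transfer into a high-impedance amp input."""
    ratio = load_z / (source_z + cable_r + load_z)
    return 20 * math.log10(ratio)

# Even a longish interconnect adds only a fraction of an ohm of series
# resistance, so the loss is a tiny fraction of a dB.
print(round(interconnect_loss_db(), 3))  # → -0.019
```

Because the amp input impedance is tens of kilohms, the cable's fraction-of-an-ohm resistance is negligible in the divider; that is why interconnect length matters so little for level.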

The amp-to-speaker interface needs to transfer power (voltage and current), and the input impedance of almost any speaker varies widely with frequency. Increasing cable length increases series resistance and reduces power transfer, and it does so in a manner that varies with frequency because of the varying load.
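A small sketch of why that loss is frequency-dependent. The impedance curve below is an invented example, not any real speaker, and the cable resistance is an arbitrary round-trip figure:

```python
import math

# Illustrative speaker impedance at a few frequencies (ohms).
# These numbers are made up for the example, not a real speaker curve.
speaker_z = {40: 3.2, 200: 6.5, 1000: 8.0, 10000: 4.5}

def loss_db(cable_r, load_z):
    """Attenuation from cable series resistance into a given load impedance."""
    return 20 * math.log10(load_z / (cable_r + load_z))

# The same 0.5-ohm cable resistance attenuates the low-impedance regions
# of the curve more than the high-impedance regions, subtly tilting the
# frequency response rather than just lowering the level.
for freq, z in speaker_z.items():
    print(f"{freq:>6} Hz: {loss_db(0.5, z):.2f} dB")
```

The point is not the absolute numbers but the spread: the dip at the impedance minimum loses roughly twice as many dB as the impedance peak, which a volume control cannot compensate for.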

All that assumes near-ideal equipment and really long runs. Your setup is not stressful.
You might just try it both ways. If you notice no difference, then just do what is convenient. Having said that, I agree with the above post. 8 to 10 feet is no big deal either way.
Kr4: Since your concern is with loss of power transfer, which do you think "loses" more power/signal?

A) A long run of 20 gauge or thinner wire (as commonly found in most interconnects)

OR

B) A long run of 14 gauge or heavier wire (as commonly found in most speaker cables)

As far as the "voltage losses" that can be compensated for with the volume control: that is not just "voltage" you've lost, it is a dynamic part of the signal. Since the losses are most likely to take place when the least amount of signal/voltage is present, the likely effect is that one will lose low-level information, which results in the masking of subtle details. Some may mistake this loss of signal or noise transfer for a reduction in the system's noise floor, i.e. a "blacker background" due to no noise or signal being present at very low levels, but it is in actuality a reduction in resolution and dynamic range.

Obviously, there are pros and cons to each method. If your system is carefully thought out and uses conductors suitable for passing the quantity of signal in operation without incurring measurable amounts of series resistance, chances are either method will work "okay". Sean
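For a rough sanity check on the gauge comparison above, here is a sketch using standard copper AWG resistance figures; the 10 ft lengths are arbitrary example runs, not anyone's actual setup:

```python
# Loop (out-and-back) resistance of a thin interconnect vs a heavy
# speaker cable, using standard copper AWG values in ohms per 1000 ft.
AWG_OHMS_PER_1000FT = {20: 10.15, 14: 2.525}

def loop_resistance(gauge, length_ft):
    """Round-trip resistance of a cable run of the given gauge and length."""
    return 2 * length_ft * AWG_OHMS_PER_1000FT[gauge] / 1000

# Example lengths only: 10 ft of 20 AWG interconnect vs 10 ft of
# 14 AWG speaker cable.
print(f"20 AWG interconnect: {loop_resistance(20, 10):.3f} ohms")
print(f"14 AWG speaker run:  {loop_resistance(14, 10):.3f} ohms")
```

Note that even the thin interconnect's ~0.2 ohms is trivial against a high-impedance amp input, while the same resistance in series with a 4-ohm speaker would already be measurable, which is exactly why each role conventionally gets a different gauge.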
We have seen gain in our long cable runs. Why? Because signal degradation is mostly a product of field contamination and poor resonance tuning, NOT CABLE LENGTH. Remember: alternating current is a complementary technology that's in diametric opposition. Think outside the box!
Actually, my concern is the harmonic change due to very long speaker cables rather than power or voltage losses. In addition, having lived with a 10-meter run for one or the other, I have experimented with both and found my preference. Of course, I generally use a preamp with a <50-ohm output impedance and excellent voltage output.

In the OP's case (and indeed in any other), the best advice is to try both ways if one can.