Current mode vs voltage mode


Would some kind audiophile, in relatively simple terms, explain the difference between current mode and voltage mode signal transfer?

The reason I'm so interested is that I own a Halcro amp/preamp combo which sounds much too thin when connected with high quality XLR voltage/standard interconnects. However, when I use the unbalanced current mode connection (with standard high quality RCA cables) between amp and preamp, it sounds much more musical and the lean quality almost completely disappears.

Thanks for all thoughts in advance

Brian
audiobrian
Post removed 
Thanks Bob

Your informative post is very helpful. I'm not sure if impedances are the entire story, however, as what I am reading in the Halcro manuals is very similar to Krell's description of their CAST/KCT technology, with "infinite" output impedance on their preamp current outputs and minimal, under 100 ohm, input impedances on their amp current inputs. Or is the Krell current tunnel technology nothing but a choice of impedances to simulate a tube amplifier? Sorry if my electrical background is lacking.
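For whatever it's worth, here's a rough back-of-the-envelope sketch of how a current-mode link is usually described: the preamp output looks like a near-ideal current source (very high output impedance) and the amp input looks like a very low impedance, so the signal travels as current and very little voltage swing appears on the cable. This is not Krell's or Halcro's actual circuit, just illustrative numbers (the signal current, cable resistance, and 47k voltage input are assumptions; the "under 100 ohm" figure is from the manuals):

```python
# Illustrative sketch of a current-mode interconnect, NOT the actual
# Krell CAST or Halcro circuit; most values below are assumptions.

signal_current_ma = 1.0        # assumed signal current from the preamp, in mA
amp_input_impedance_ohm = 100  # "under 100 ohm" current input, per the manuals
cable_resistance_ohm = 0.1     # assumed series resistance of an RCA cable

# With a near-ideal current source driving the link, the current through the
# amp input is set by the source, and the cable's small series resistance
# barely matters. The voltage that actually appears at the amp input is tiny:
v_at_amp_input = (signal_current_ma / 1000) * amp_input_impedance_ohm
print(f"Voltage swing at amp input (current mode): {v_at_amp_input * 1000:.1f} mV")

# Compare a conventional voltage-mode link: a low output impedance drives a
# high input impedance, and the signal travels as voltage instead.
preamp_output_impedance_ohm = 100        # assumed
amp_voltage_input_impedance_ohm = 47_000  # assumed typical line input
source_voltage = 1.0                      # assumed 1 V signal
v_delivered = source_voltage * amp_voltage_input_impedance_ohm / (
    preamp_output_impedance_ohm + amp_voltage_input_impedance_ohm)
print(f"Signal delivered (voltage mode): {v_delivered:.3f} V")
```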

Brian
Sean or anyone? I had thought this thread was a question about current vs. voltage based amps. I don't see the type of I/C used as relating to how the amplification is accomplished.
Sort of like how Nelson Pass's amps are current-driven whereas most are voltage-driven, if I'm not mistaken.
Maybe now this is off-topic but since you are posting, any insights on advantages of current vs. voltage driven amps?
Are chip amps (National LM3875) also voltage driven? Can't figure out why they sound so good.
Post removed 
Hi Bob, you got it backwards: the higher output impedance amplifiers behave as 'current sources' rather than 'voltage sources'.

Actually the two concepts are a bit misleading and counter-intuitive. A low impedance source is so called because it can make the same voltage into any load. So it is a 'voltage source'. It's often said that this type of amplifier is 'load impervious'. Myself, I think there is a lot of mythology involved: for example, if such an amp does, say, 100 watts into 8 ohms, it will do 200 into 4 ohms and so on. Conversely, it will do 50 watts into 16 ohms and 25 into 32. This is typical of a lot of transistor amplifiers, and you can see why it might not work on all speakers, ESLs for example (which often have higher impedances at low frequencies). The mythology comes in when it is said that the amp is thus 'load impervious'!
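To put numbers on that doubling/halving behavior, here is a tiny idealized sketch. It assumes a perfect voltage source (no current limiting, no supply sag, no thermal limits, which every real amp has), so take it as illustration only:

```python
import math

# Idealized voltage-source amplifier: output voltage is fixed regardless of load.
# Pick the voltage so the amp makes 100 W into 8 ohms.
v_out = math.sqrt(100 * 8)   # ~28.3 V RMS

for load_ohms in (32, 16, 8, 4, 2):
    power = v_out ** 2 / load_ohms   # P = V^2 / R for a constant-voltage source
    print(f"{load_ohms:>2} ohms -> {power:6.1f} W")
# Prints 25, 50, 100, 200, 400 W: power doubles every time the load impedance halves.
```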

A 'current source' amplifier will *attempt* to make constant power, not constant voltage, into these same impedances. If a speaker is designed for a 'voltage source', it may not have flat frequency response with the 'current source' (even though it may sound better in other ways). OTOH, speakers designed around 'current source' characteristics (which describes most, but not all, tube amps) will have flat frequency response with such an amp, even though the speaker technology itself may be exactly the same.
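Here is a companion sketch for the high output impedance case. I've modeled the amp as a voltage source behind a large output impedance, and the choice of an output impedance equal to the nominal 8 ohm load is purely an assumption for illustration (real tube amps vary widely). The point is that the delivered power stays much flatter as the load impedance swings around, instead of doubling and halving:

```python
import math

# Amplifier modeled as an ideal voltage source behind a large output impedance.
# r_out equal to the nominal load is an assumed, illustrative value.
r_out = 8.0
# Pick the internal voltage so the amp still makes 100 W into 8 ohms.
v_internal = math.sqrt(100 * (r_out + 8.0) ** 2 / 8.0)   # ~56.6 V

for load_ohms in (4.0, 8.0, 16.0, 32.0):
    i = v_internal / (r_out + load_ohms)   # current through the load
    power = i ** 2 * load_ohms             # power delivered to the load
    print(f"{load_ohms:>4.0f} ohms -> {power:6.1f} W")
# Prints roughly 89, 100, 89, 64 W: far closer to constant power than the
# 200 / 100 / 50 / 25 W a constant-voltage amp would deliver into the same loads.
```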

It's all in what the speaker designer expects of the amplifier. Since different speakers expect different things, it can be a bit tricky (and expensive!) to make the right choice.

Sorry for the long-winded explanation.