I do not understand, technically, why a particular preamp may have a greater capability to drive a low-impedance amplifier. While I try to follow these types of discussions, something basic is lacking in my current understanding, as well as in how it impacts the sound. Could you elaborate further, or direct me to another thread where I am sure this has been covered before?
No need- I can explain it.
There are two factors, output impedance and low frequency cutoff. They are related.
First is output impedance: generally, to avoid distortion, the amplifier should have an input impedance at least 10x greater than the output impedance of the preamp. That's easy enough- any preamp mentioned so far does that.
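If you want to check this for your own gear, here's a trivial Python sketch (the 600 ohm and 10K figures are made-up examples, not any particular pairing):

```python
# Hypothetical values for illustration only.
preamp_output_impedance = 600   # ohms
amp_input_impedance = 10_000    # ohms

# Rule of thumb: amp input impedance should be at least 10x the
# preamp's output impedance to avoid distortion and level loss.
ratio = amp_input_impedance / preamp_output_impedance
print(f"ratio {ratio:.1f}:1 -> {'OK' if ratio >= 10 else 'too low'}")
```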
The second bit- low frequency cutoff- is a function of the value of the preamp's output coupling capacitor vs. the input impedance of the amp. The two together form a time constant (an RC high-pass filter). Here's the formula:
F = 1,000,000 / (2 x Pi x C x R)
F is frequency in Hz, C is capacitance in microfarads, and R is resistance in ohms. Normally the formula looks a bit different (F = 1 / (2 x Pi x C x R), with C in farads), but since microfarads is a more convenient capacitive value I adjusted it with the factor of 1,000,000.
So if we have a 10uF capacitor driving a 10K load at the input of an amplifier, plugging in the values we see that:
1.59 = 1,000,000 / (10 x 10,000 x 6.28)
IOW it will be 3dB down at 1.59Hz.
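Here's that same math as a quick Python sketch, if you want to plug in your own numbers:

```python
import math

def cutoff_hz(cap_uf, load_ohms):
    """-3dB point of the high-pass formed by the preamp's output coupling
    cap (microfarads) and the amp's input impedance (ohms)."""
    return 1_000_000 / (2 * math.pi * cap_uf * load_ohms)

# The example from the text: a 10uF cap into a 10K input.
print(f"{cutoff_hz(10, 10_000):.2f} Hz")  # ~1.59 Hz
```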
The problem is that a 10uF cap has colorations, even if it's the best Teflon cap money can buy, although the cutoff frequency itself is good. A preamp manufacturer has to weigh options, and one of them might be that they want it to sound better with their own amps. So they might limit the value of the coupling capacitor- and thus raise the cutoff frequency into lower-impedance loads. IOW they sacrifice bass response and impact for greater transparency. But if their ideal amp has a high input impedance, they might not be sacrificing any bass at all.
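To put numbers on that tradeoff, here's a sketch running a few cap values into 10K and 100K loads (the values are just illustrative):

```python
import math

# A smaller (more transparent) coupling cap pushes the -3dB point
# up into a 10K load, but stays harmless into a 100K load.
for cap_uf in (1, 2, 10):
    for load_ohms in (10_000, 100_000):
        f3 = 1_000_000 / (2 * math.pi * cap_uf * load_ohms)
        print(f"{cap_uf:>2} uF into {load_ohms // 1000:>3}K: -3dB at {f3:.2f} Hz")
```

So a 1uF cap is flat to 1.6Hz into 100K but already rolling off at 16Hz into 10K.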
Now if you can find a review of a preamp that graphs its output impedance vs. frequency, you can see the effect of the coupling capacitor- the output impedance rises as you approach the cutoff (-3dB) point. This shows up in a lot of tube preamps, so you can see that many manufacturers regard driving 10K as perhaps not worth sacrificing the transparency they get with a smaller coupling cap.
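If you want to see why that curve rises, here's a simple model- just a source impedance in series with the cap's reactance (the 600 ohm and 2uF figures are made up, not any specific preamp):

```python
import math

r_out = 600   # ohms, hypothetical source impedance
cap_uf = 2    # uF, hypothetical coupling cap

for f in (20_000, 1_000, 100, 20):
    xc = 1_000_000 / (2 * math.pi * f * cap_uf)  # capacitive reactance, ohms
    z_out = math.hypot(r_out, xc)                # magnitude of the series combination
    print(f"{f:>6} Hz: |Zout| = {z_out:,.0f} ohms")
```

At 1kHz the cap is nearly invisible; by 20Hz it dominates the output impedance.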
Because our preamp has a direct-coupled output, our output impedance curve is identical to the frequency response curve. With capacitor-coupled preamps this isn't the case.

One further thing of note: it's good design practice to set the -3dB point at no more than 1/10th the lowest frequency you want to play, so 2Hz if you want to be good to 20Hz. This ensures that there will be essentially no phase shift at 20Hz, which gives you better bass impact. So this is part of the issue- getting that solid bass all the way down. You need that margin in order to avoid artifacts caused by phase shift. IOW if you want proper bass at 20Hz, your preamp should be flat down to 2Hz while driving the input of your amplifier.
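Here's the math behind that 1/10th rule- a first-order high-pass shifts phase by arctan(Fc/F):

```python
import math

# Phase shift of a first-order high-pass at frequency f, given its -3dB point fc.
def phase_deg(f_hz, fc_hz):
    return math.degrees(math.atan(fc_hz / f_hz))

# With the recommended 10x margin (fc = 2Hz) the shift at 20Hz is small;
# with the cutoff right at 20Hz it's a full 45 degrees.
for fc in (2, 20):
    print(f"fc = {fc:>2} Hz -> phase at 20 Hz: {phase_deg(20, fc):.1f} deg")
```

That prints about 5.7 degrees for the 2Hz cutoff vs. 45 degrees with the cutoff at 20Hz- the margin is what keeps the bass phase-correct.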