Question About Audio Research Preamp Specs


Hi,

I am trying to understand Audio Research terminology while looking at their product specs.

For example, for the LS28 OUTPUT IMPEDANCE, it reads "600 ohms Balanced, 300 ohms SE Main (2), 20K ohms minimum load and 2000pF maximum capacitance..."

What does the "20K ohms minimum load" tell me?

I am running a McCormack DNA-500 and it has an input impedance of 10K Ohms. It would seem that the amp would be compatible with this preamp based on the 10X/15X minimum rule of thumb. 

My concern is that the AR spec is telling me to use amps with an input impedance of at least 20K ohms, and that the DNA would not work well.

Thanks for listening,

Dsper
georgehifi,

"Great idea, tell them you want the "uF" of coupling cap increased so you can have "little or no" rolloff above 20hz into your amps 10kohm input impedance."

georgehifi, You certainly have given me some discussion points with CJ and what is possible to accomplish. Thank you!

Why would CJ not have done this in the first place? Maybe a cost/benefits decision considering how relatively few amps have 10K ohm impedance?

Or could this possibly change/degrade the overall sound of the preamp?

I assume my last question may illustrate how little I know about this technical stuff so any additional comments would be greatly appreciated!

Thanks for listening,

Dsper
Atmosphere,

" But obviously the Aesthetix faces the same issue of what is the lowest impedance it can actually drive actually; this information does not show up in their specs or operating manual."

I do not understand technically why a particular preamp may have a greater capability to drive a low impedance amplifier.

While I try to follow these types of discussions, something basic is lacking in my current understanding, as well as how it impacts the sound.

Could you perhaps elaborate further or direct me to another thread where I am sure that this has been covered before?

Thanks for listening!

Dsper
Why would CJ not have done this in the first place? Maybe a cost/benefits decision considering how relatively few amps have 10K ohm impedance?

It wasn't common to see power amps with input impedances as low as 10K in the past, as the loose standard was 47K or 50K, and many, especially tube amps, were 100K.
  
Ever since Class-D reared its ugly head, 10K ohm inputs have become more common, which means many tube preamps and many passives are not suited to them.

I believe there should be one standard, and that's 100K ohms; then anything can drive it. Hell, Rogue Audio use 1 megohm on their power amps and they don't have any problems at all.

Cheers George 
I do not understand technically why a particular preamp may have a greater capability to drive a low impedance amplifier.

While I try to follow these types of discussions, something basic is lacking in my current understanding, as well as how it impacts the sound.

Could you perhaps elaborate further or direct me to another thread where I am sure that this has been covered before?
No need- I can explain it.
There are two factors, output impedance and low frequency cutoff. They are related.
First is output impedance: generally, to avoid distortion, the amplifier should have an input impedance at least 10x the output impedance of the preamp. That's easy enough- any preamp mentioned so far does that.
The second bit- low frequency cutoff- is set by the value of the output coupling capacitor of the preamp against the input impedance of the amp. The two together form an RC time constant. Here's the formula:
F = 1,000,000 / (C × R × 2π)

F is frequency in Hz, C is capacitance in microfarads, and R is resistance in ohms. Normally the formula looks a bit different (1 divided by the other factors, with C in farads), but since microfarads are a convenient capacitive value I adjusted it.
So if we have a 10uf capacitor driving a 10K load at the input of an amplifier, plugging in the values we see that:
1.59 = 1,000,000 / (10 × 10,000 × 6.28)
IOW it will be 3dB down at 1.59Hz.
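As a quick sanity check, the calculation above can be put into a few lines of Python (a sketch using the microfarad form of the formula, with the same 10uF / 10K values):

```python
import math

def cutoff_hz(c_uf, r_ohms):
    """-3dB high-pass cutoff for a coupling cap (in uF) into a load (in ohms)."""
    return 1_000_000 / (c_uf * r_ohms * 2 * math.pi)

# 10uF coupling cap driving a 10K amp input:
f = cutoff_hz(10, 10_000)
print(round(f, 2))  # ~1.59 Hz, matching the figure above
```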

The problem is that a 10uF cap has colorations, even if it's the best Teflon cap money can buy. Otherwise this cutoff frequency is good. But a preamp manufacturer has to weigh options, and one of them might be that they want it to sound better with their own amps. So they might limit the value of the coupling capacitor- and thus increase the cutoff frequency into lower impedances. IOW they sacrifice bass response and impact for greater transparency. But if their ideal amp has a high input impedance, they might not be sacrificing any bass at all.

Now if you can find a review of a preamp that graphs its output impedance vs frequency, you can see the effect of the coupling capacitor- the output impedance rises as you approach the cutoff (-3dB) point. This shows up in a lot of tube preamps, so you can see that many manufacturers regard driving 10K loads as not worth giving up the transparency they get with a smaller coupling cap.

Because our preamp has a direct-coupled output, our output impedance curve is identical to the frequency response curve. With capacitor-coupled preamps this isn't the case. One further thing of note- it's good design practice to set the -3dB point at 1/10th of the lowest frequency you want to play, so 2Hz if you want to be good to 20Hz. This ensures that there will be no phase shift at 20Hz, which gives you better bass impact. So this is part of the issue- getting that solid bass all the way down. You need that margin in order to avoid artifacts caused by phase shift. IOW if you want proper bass at 20Hz, your preamp should be flat down to 2Hz while driving the input of your amplifier.
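The margin rule above can also be turned around to estimate the coupling cap a preamp would need for a given amp (a rough sketch using the same RC formula; the input impedances are just illustrative values):

```python
import math

def cap_needed_uf(f_hz, r_ohms):
    """Coupling cap (in uF) needed to put the -3dB point at f_hz into a load (in ohms)."""
    return 1_000_000 / (f_hz * r_ohms * 2 * math.pi)

# For response flat to 20Hz, target a -3dB point of 2Hz:
print(round(cap_needed_uf(2, 100_000), 1))  # into a 100K amp input: ~0.8 uF
print(round(cap_needed_uf(2, 10_000), 1))   # into a 10K amp input:  ~8.0 uF
```

This is why a low 10K input impedance is awkward for capacitor-coupled preamps: holding the same 2Hz cutoff into 10K takes roughly ten times the capacitance needed for a 100K load, and large coupling caps bring the colorations mentioned above.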





I really like the matchup with Ayre and ARC, maybe because of the high ratio between input and output impedance.