Major help needed with input and output impedance...


Okay, so if an amplifier has a 10 kohm (10,000 ohm) input impedance, what are the parameters for the preamp's output? I see anything from 50 ohm to 600 ohm on preamp outputs, and amplifiers anywhere from 10 kohm, 20 kohm, 30 kohm, 50 kohm, etc. I know there is a rule of thumb, but what's weird is that most of the manufacturers I see building amps with 10 kohm up to 50 kohm inputs are building preamps with only about 50 ohm outputs... nothing near the 220 ohm to 750 ohm that some other manufacturers use.

I will soon have a preamp with a 600 ohm output, and my amps have only 10,000 ohm inputs. So what does this mean? Looking at the matching most manufacturers use, the 600 ohm output almost seems incompatible, since their preamps generally have much lower output impedance than mine. So what happens: do I get less gain, or more noise? What can go wrong if the impedances between these two components are mismatched? Or is all this irrelevant, and should I just get the preamp, listen, and not worry about it? I swear I read something about this topic at some point and really want to make sure I know what I am doing.

Thanks. P.S. In my post I used 'k' for multiples of 1,000, in case anyone got confused when I wrote out the whole number and then switched to the abbreviation.
matrix
You optimize voltage transfer when the preamp's output impedance is minimal and the amp's input impedance is maximal. However, there are other problems when these are pushed to extremes (noise/RF pickup, etc.), and once you get near two orders of magnitude of difference, further improvements are asymptotic. (600 ohms into a 10k load should be OK as long as the preamp can supply adequate current.)
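To put a number on that, here's a minimal sketch of the voltage divider formed by the preamp's output impedance and the amp's input impedance (assuming purely resistive, frequency-independent impedances, which real gear only approximates):

```python
import math

def divider_loss_db(zout_ohms, zin_ohms):
    """Fraction of the preamp's output voltage that reaches the amp input, in dB."""
    ratio = zin_ohms / (zout_ohms + zin_ohms)
    return 20 * math.log10(ratio)

print(divider_loss_db(600, 10_000))  # ~ -0.51 dB: the 600 ohm pre into the 10k amp
print(divider_loss_db(50, 10_000))   # ~ -0.04 dB: a 50 ohm pre, for comparison
```

Half a dB of flat attenuation across the whole band is just a hair less gain, not a tonal change, which is why 600 ohms into 10k should be OK.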

Some manufacturers don't rely on voltage transfer but are instead current-based, and those interfaces depend on matching the input and output impedances.
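For those matched interfaces, the classic reason for matching is the maximum power transfer theorem; a small sketch (again assuming ideal resistive impedances):

```python
def power_fraction(zsource, zload):
    """Power delivered to the load, relative to the matched (zload == zsource) case."""
    matched = 1 / (4 * zsource)               # peak of zload / (zsource + zload)**2
    actual = zload / (zsource + zload) ** 2
    return actual / matched

print(power_fraction(600, 600))     # 1.0  -- the classic matched 600 ohm line
print(power_fraction(600, 10_000))  # ~0.21 -- fine for voltage transfer, poor for power
```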

Kal
I could be wrong, but I'm more concerned with the actual gain and output voltage of the preamp. The lower output impedance seems normal to me; your preamp should do just fine. I always equated the preamp's output impedance with how well it could handle the impedance of the interconnects, or the load of the amplifier's input. The lower the output impedance, the less the length of the interconnect becomes a factor in how it performs. I would think too much gain or output voltage could be more of a problem: you either get a hair-trigger volume control, or one that has to be turned up more than halfway before you reach a comfortable volume.

If my amplifier has high gain, say 25 or 30 dB, and it only takes 1 volt to reach full power, I wouldn't want a preamp with 30 dB of gain that claims 15 volts of output. That would be like using a megaphone while talking into a microphone... I think you get my drift ;-). Like I said, I could be wrong, but this is what I look at. Someone more technically inclined than myself can put it in engineering terms.
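A rough sketch of the arithmetic behind that megaphone analogy (the gain and voltage figures are just the hypothetical ones from the paragraph above, not any real unit's specs):

```python
def db_to_voltage_ratio(gain_db):
    """Convert a gain in dB to a voltage ratio."""
    return 10 ** (gain_db / 20)

amp_input_for_full_power = 1.0          # volts, per the hypothetical amp above
preamp_gain = db_to_voltage_ratio(30)   # ~31.6x voltage gain
print(preamp_gain * 2.0)                # a 2 V CD player would ask for ~63 V;
                                        # the preamp clips at its 15 V rating long
                                        # before that, and anything past 1 V is
                                        # wasted range on the volume knob anyway
```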
Hmm, well, the amp has no input gain spec shown, but the preamp is rated from 2 volts up to 10 volts of output. So if I have a 2 volt CD player, I would guess I get 2 volts out of the pre, with a maximum output of 10 volts. In my case I have a 4 volt CD player, so I assume it will pass 4 volts down to the amp. Now, I would guess the only way to drive the pre to its full 10 volt output would be something like a 4 volt CD player into an equalizer or something that boosts the signal another 6 volts, and then you would be at max. So I am not really worried about the preamp being overdriven, because no source I own, or have even seen sold, puts out that kind of voltage; maybe some phono stage? The gain spec isn't shown on the preamp's papers either, but I would guess it is a 20 dB gain stage.

It does have a signal-to-noise ratio of 112 dB, which is pretty good from what I understand, but I don't know if that spec tells us anything. It is also capable of a frequency range of 10 Hz to 200,000 Hz, so it's a high-end piece, not the standard 20 Hz to 20,000 Hz most are spec'd at. Then again, this all may mean nothing?
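For what it's worth, the 112 dB figure translates into volts like this, assuming it is referenced to the 10 V rated output (the reference level is an assumption; the papers may specify something else):

```python
# Noise floor implied by a 112 dB S/N ratio, assuming a 10 V reference level.
noise_volts = 10.0 * 10 ** (-112 / 20)
print(noise_volts)   # ~25 microvolts of noise under a 10 V signal
```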
Not necessarily so, Matrix. The standard output voltage of most CD players is 2 volts, and the standard input sensitivity of most amplifiers is 1.5 to 2 volts. I've seen amplifiers that need 4 volts to reach full power (the designer claimed quieter operation this way). You should be able to get 10 volts from the preamp using a 2 volt source without problems; the preamp may start at just a tenth of a volt and go up from there on the volume control. You really should be fine with the preamp. As far as the crazy specs go, IMHO the 200 kHz is pretty meaningless. Most of us male humans can't hear past maybe 18 kHz... I don't know, though, some audiophiles do claim to hear like bats... LOL.
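Rough numbers behind that (using matrix's guessed 20 dB gain stage, which isn't a published spec):

```python
import math

def gain_needed_db(v_in, v_out):
    """Gain required to swing v_out from a v_in source."""
    return 20 * math.log10(v_out / v_in)

print(gain_needed_db(2.0, 10.0))  # ~14 dB gets a 2 V source to the 10 V rating
print(2.0 * 10 ** (20 / 20))      # a 20 dB stage would try for 20 V; the 10 V
                                  # rating just means the preamp clips there
```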

The variation in dB from point to point would also have more meaning for me. For example, a frequency response of 10 Hz to 20 kHz +/- 3 dB or better is very good, and some may claim +/- 1 dB variation; it doesn't get much better than that. Most high-end speakers don't achieve such a flat response anyway. The only commercial speakers I've seen claim anything close are the Green Mountain Audio speakers (after hearing them, I believe it too). Even then, once you get into the lower bass region, the variation still drops to +/- 3 dB, which ain't bad at all. There should be a certain amount of rolloff to get the correct decay and timbre of certain instruments, IMHO.