Convert cartridge output voltage to dB gain


Happy holidays everyone. I hope you can help me with a problem. I have re-configured my system: the preamp and amp are gone, and I'm now using an integrated amp. I still have my phono stage and cartridge. My cartridge has a 0.24 mV output, and my phono stage has 66 dB of gain. This used to sound fine, but now I notice that the noise floor is too high for me. So I'm debating whether to look for a higher gain phono stage, or more likely, a higher output cartridge.

So now my question: how much more output would give me how much more gain? Should I be looking at a 0.5, 1.0 or 2.0+ mV output cartridge? I think I need at least 10 dB more gain, and there are not many 76 dB+ phono stages out there. So what do you analog experts think? Is there any table out there that shows how to convert voltage output to gain increase? TIA.

Cheers,
John
jmcgrogan2
Correction. The last paragraph of my previous post should have read:

Too much gain is only an issue if the volume control ends up being used near the bottom of its range, if the input of a component is overloaded (neither of which applies here), or if the additional gain comes at the expense of a degraded s/n ratio in the particular design (which may be the case here).

Regards,
-- Al
Al: In this case, lowering the cartridge output (or reducing the gain, for that matter) would indeed mean John has to run the volume pot higher with phono. But as you point out in your last paragraph, that is only an issue if he's running the integrated's volume pot flat out, and that is not the case now, nor IMO would it be with 3 dB less gain at the phono stage.

IME, the noise or hiss he describes is entirely consistent with too much gain at the phono stage. In effect, the phono stage is either on the verge of being overloaded, or is being overloaded, by a cartridge that has too much output for the amount of gain on tap. That manifests itself in the hiss he's describing, as well as a bit of hardening in sound quality and a collapse of the soundstage, particularly front to back.

Not having enough gain can also create noise problems as well as totally squashing dynamics and giving the effect of listening through about 3 wet blankets.

Particularly with low-output MCs having outputs in the .15 to .35 mV range, dialing in the gain is critical. Get 3 dB off in either direction at the phono stage and you have problems; get it 4-5 dB or more wrong and it's not worth listening to. And those problems cannot be rectified further down the line than the phono stage.

Changing tubes may indeed help. I don't have much experience with tubes, to be candid. But I would expect that if it does help, it will be because the tubes themselves are, in fact, altering the overall gain the phono stage ultimately provides.

I would be curious to hear what Lloyd Walker would recommend in this situation, but I do think it is a gain issue and that the phono stage has too much gain for the cartridge being used.
Another factor to consider is that the .24 mV cartridge may in fact have an output slightly higher than that. Both of my 103Rs, spec'd at .25 mV, came with data sheets showing measurements of the individual cartridges, and both measured higher than .25 mV. The one closest at hand shows an output of .31 mV on one channel and .32 mV on the other.

If this is the case with the OP's cartridge, it would be an argument for possibly reducing the gain in the phono stage even further.
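Just to put rough numbers on that, here is a quick sketch of the arithmetic (the standard 20*log10 voltage-ratio relationship; the function name is only my own illustration, and the .31/.25 mV figures are from my 103R's data sheet, not from the OP's cartridge):

```python
import math

def db_difference(v_actual_mv, v_spec_mv):
    """Level difference, in dB, between an actual and a spec'd output voltage."""
    return 20 * math.log10(v_actual_mv / v_spec_mv)

# My 103R example: .31 mV measured vs. the .25 mV spec
print(round(db_difference(0.31, 0.25), 1))  # ~1.9 dB hotter than spec
```

So a cartridge running a few hundredths of a millivolt over spec is already a couple of dB hotter than the gain was dialed in for.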
Hdm -- If the preamp section of the integrated amp is introducing a certain amount of noise into the signal path, how is reducing the signal level going into that amp going to help? Answer: it will not; it will make things worse, because the signal-to-noise ratio will be degraded.

The statement that the higher gain of the new amp is exacerbating noise resulting from a cartridge/phono stage mismatch makes no sense. The signal level, noise level, signal-to-noise ratio, and distortion level of what the cartridge + phono stage feeds into the new amp are the same as what was being fed into the prior preamp + amp. If the new amp provides higher gain than the prior preamp + amp, the volume control would simply be turned down correspondingly, with no difference in the resulting noise EXCEPT for any difference in the s/n performance of the new amp compared to the prior preamp + amp.
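To put very rough numbers on this, here is a back-of-the-envelope sketch using the nominal 0.24 mV and 66 dB figures (and ignoring loading, RIAA accuracy, and other frequency-dependent effects):

```python
cartridge_output_mv = 0.24   # nominal cartridge output
phono_gain_db = 66           # phono stage gain

# Signal level leaving the phono stage; identical whichever amp follows it
phono_output_v = (cartridge_output_mv / 1000) * 10 ** (phono_gain_db / 20)
print(round(phono_output_v, 3))  # ~0.479 V

# The downstream volume control scales the accompanying noise by exactly the
# same factor as the signal, so the s/n ratio handed over by the cartridge +
# phono stage is unchanged no matter how the amp's gain or volume is set.
```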

Regards,
-- Al
I'm sorry Kal, but I tend to agree with Al here. I'm already listening with the knob set at 2 o'clock on the VAC; I can't understand how lowering the cartridge voltage and/or the phono stage gain would accomplish anything but make me turn the volume knob up to 4-5 o'clock. If the problem is the VAC's preamp tubes, this wouldn't seem to solve anything, as far as I can tell.

I've owned this cartridge/phono preamp combination for about 3 years now. It was dead quiet with my ARC Ref 3 preamp. It was slightly noisier when I switched to the VAC Avatar Super (possibly due to the extra 11 dB of gain), which I've been using for the last 15 months or so. I didn't listen to any LPs for about an 8-9 month stretch, and now the noise floor is much higher with the same players. I still think it's probably either DC offset in the phono stage or a bad tube in the preamp stage of the VAC. It's certainly easier and cheaper to fiddle with the phono preamp and the preamp tubes than to buy another cartridge or phono preamp.

I guess the answer to my original question is that there is no table showing how to convert cartridge output voltage into dB of gain, but you did show me a formula to use, and I thank you for that.
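For anyone who finds this thread later, the conversion appears to be just the voltage ratio expressed in dB. Here's a rough sketch of how I'd compare candidate cartridges with it; the outputs below are simply the examples from my original question:

```python
import math

def output_gain_db(new_mv, old_mv):
    """Extra signal level, in dB, from a higher-output cartridge."""
    return 20 * math.log10(new_mv / old_mv)

current = 0.24  # mV, my present cartridge
for candidate in (0.5, 1.0, 2.0):
    print(f"{candidate} mV: +{output_gain_db(candidate, current):.1f} dB")
# 0.5 mV: +6.4 dB, 1.0 mV: +12.4 dB, 2.0 mV: +18.4 dB
```

So by that arithmetic, a 1.0 mV cartridge would give me the 10+ dB of extra signal I was after without needing a 76 dB phono stage.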

Thanks again, and happy holidays to all.

Cheers,
John