Convert cartridge output voltage to dB gain


Happy holidays everyone. I hope that you may help me with a problem. I have re-configured my system: preamp and amp gone, I'm now using an integrated amp. I still have my phono stage and cartridge. My cartridge has a 0.24 mV output, and my phono stage has 66 dB of gain. This used to sound fine, but now I notice that the noise floor is too high for me. So I'm debating whether to look for a higher-gain phono stage or, more likely, a higher-output cartridge.

So now my question: how much more output would give me how much more gain? Should I be looking at a 0.5, 1.0, or 2.0+ mV output cartridge? I think I need at least 10 dB more gain (rough arithmetic below), and there are not many 76 dB+ phono stages out there. So what do you analog experts think? Is there any table out there that can show me how to convert voltage output to gain increase? TIA.
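For reference, the conversion is just the standard voltage-ratio formula: extra gain in dB = 20·log10(V_new/V_old). Here's a minimal sketch of what that gives for the outputs mentioned above; it's plain arithmetic, not specific to any particular cartridge or phono stage:

```python
import math

def extra_gain_db(v_new_mv: float, v_ref_mv: float) -> float:
    """Extra gain, in dB, from swapping a cartridge of output v_ref_mv
    for one of output v_new_mv (standard 20*log10 voltage ratio)."""
    return 20 * math.log10(v_new_mv / v_ref_mv)

reference_mv = 0.24  # current cartridge output
for candidate_mv in (0.5, 1.0, 2.0):
    print(f"{candidate_mv} mV is {extra_gain_db(candidate_mv, reference_mv):+.1f} dB "
          f"over {reference_mv} mV")
```

That works out to roughly +6.4 dB, +12.4 dB, and +18.4 dB respectively, so the +10 dB point would be about 0.24 × 10^(10/20) ≈ 0.76 mV; a 1.0 mV cartridge would cover it comfortably.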

Cheers,
John
jmcgrogan2
Another factor to consider is that the .24 mV cartridge may in fact have an output slightly higher than this. Both of my 103Rs, spec'd at .25 mV, came with data sheets showing testing on the individual cartridges and outputs higher than .25 mV. The one that is closest at hand shows an output of .31 mV on one channel and .32 mV on the other.

If this is the case with the OP's cartridge, it would be an argument for possibly further reducing the gain in the phono stage.
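To put a number on that: a cartridge spec'd at .25 mV that actually puts out .31 mV is delivering about 20·log10(.31/.25) ≈ 1.9 dB more than the spec suggests.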
Hdm -- If the preamp section of the integrated amp is introducing a certain amount of noise into the signal path, how is reducing the signal level that is going into that amp going to help? Answer: it will not; it will make things worse, because the signal-to-noise ratio will be degraded.

The statement that the higher gain of the new amp is exacerbating noise resulting from a cartridge/phono stage mismatch makes no sense. The signal level, noise level, signal-to-noise ratio, and distortion level of what is being fed by the cartridge + phono stage into the new amp are the same as what was being fed into the prior preamp + amp. If the new amp provides higher gain than the prior preamp + amp, the volume control would be adjusted down correspondingly, with no difference in the resulting noise EXCEPT for the difference in s/n performance of the new amp compared to the prior preamp + amp.
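To put numbers on that bookkeeping, here is a minimal gain-staging sketch; the gain figures and noise voltage below are made-up placeholders purely for illustration, not measurements of any actual component:

```python
import math

def db_to_ratio(db: float) -> float:
    """Convert a dB figure to a plain voltage ratio."""
    return 10 ** (db / 20)

# What the cartridge + phono stage hands to the amp -- identical in both setups.
signal_v = 0.48    # ~0.24 mV cartridge after 66 dB of phono gain
noise_v = 0.0048   # hypothetical phono-chain noise floor (placeholder)

old_gain_db = 30                       # hypothetical preamp + amp gain
new_gain_db = 41                       # hypothetical integrated amp, 11 dB hotter
trim_db = old_gain_db - new_gain_db    # volume turned down to compensate

for label, gain_db in (("old preamp + amp", old_gain_db),
                       ("new amp, volume trimmed", new_gain_db + trim_db)):
    s = signal_v * db_to_ratio(gain_db)
    n = noise_v * db_to_ratio(gain_db)
    print(f"{label}: {s:.2f} V signal, {n:.4f} V noise, "
          f"S/N = {20 * math.log10(s / n):.0f} dB")
```

Both rows come out identical; the only remaining variable is whatever noise the new amp itself contributes.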

Regards,
-- Al
I'm sorry Kal, but I tend to agree with Al here. I'm already listening with the knob set at 2 o'clock on the VAC; I can't understand how lowering the cartridge voltage and/or phono stage gain would accomplish anything but make me turn the volume knob up to 4-5 o'clock. If the problem is the VAC preamp tubes, this wouldn't seem to solve anything as far as I can understand.

I've owned this cartridge/phono preamp combination for about 3 years now. It was dead quiet with my ARC Ref 3 preamp. It was slightly noisier when I switched to the VAC Avatar Super (possibly due to the extra 11 dB of gain), which I've been using for the last 15 months or so. I didn't listen to any LPs for about 8-9 months, and now the noise floor is much higher with the same players. I still think it's probably either the DC offset in the phono stage or a bad tube in the preamp stage of the VAC. It's certainly easier and cheaper to fiddle with the phono preamp and the preamp tubes than to buy another cartridge or phono preamp.

I guess, as an answer to my original question, there is no table that shows how to convert cartridge output voltage into dB gain, but you did show me a formula to use, and I thank you for that.

Thanks again, and happy holidays to all.

Cheers,
John
Thank you Darkmoebius. Using this calculator, it looks like I would need a phono stage with 80 dB of gain, or a cartridge with a 1.25 mV output, to match the 2.5 V output of my CDP in SPL.

My guess was at least a 10 dB difference, but now I know that there is a 14 dB difference, which explains the different settings on the volume control knob.
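For the record, here is the arithmetic behind those numbers, using the same 20·log10 formula with the 0.24 mV, 66 dB, and 2.5 V figures from this thread:

```python
import math

cart_mv = 0.24        # cartridge output
phono_gain_db = 66    # phono stage gain
cdp_v = 2.5           # CD player line-level output

# Line-level output of the cartridge + phono stage.
phono_out_v = (cart_mv / 1000) * 10 ** (phono_gain_db / 20)
shortfall_db = 20 * math.log10(cdp_v / phono_out_v)
print(f"phono chain output: {phono_out_v:.2f} V")        # ~0.48 V
print(f"shortfall vs the CDP: {shortfall_db:.1f} dB")    # ~14.3 dB

# Two ways to close the gap:
gain_needed_db = 20 * math.log10(cdp_v / (cart_mv / 1000))    # ~80.4 dB
cart_needed_mv = cdp_v / 10 ** (phono_gain_db / 20) * 1000    # ~1.25 mV
print(f"phono gain needed: {gain_needed_db:.1f} dB")
print(f"or cartridge output needed: {cart_needed_mv:.2f} mV")
```

That lines up with the calculator's 80 dB / 1.25 mV answer and the roughly 14 dB offset on the volume knob.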

Still awaiting Lloyd's reply. I guess with the holidays and all I shouldn't hold my breath.

Cheers,
John