A/D input levels with 24-bit - need to max out?


Hi,

I use a USB Pre A/D converter with a Grado PH1 phono stage to record vinyl records in 24-bit/44.1 kHz stereo. I'm using Bias Peak to record, and Cool Edit and ClickFix to process. Back in Peak again, I raise the gain to the highest unclipped level before dithering to 16 bits to make CDs or FLAC archives.

I record a lot of records these days, so to avoid re-recording due to clipped levels on the USB Pre, I set the levels conservatively. As a result, the peak for some records might be 80% or so instead of 95% or more. I do try to spot the loud records in a set and adjust the level accordingly. But the Pre's level knobs are tricky to set evenly by eye (the markers are widely spaced), so I may adjust downward more than I otherwise would.

My rationale has been that raising the input level doesn't help much, since I'd also be raising the noise level of the phono stage and any noise generated by the analog portion of the converter, so the signal-to-noise ratio would not change.

Now, I may be answering my own question here, but if I understand digital audio correctly, more bits are given to louder passages, so by keeping the level lower, I'm getting fewer bits for the quiet parts than I should be getting.

But, since I'm recording at 24-bit anyway, before editing and then dithering (to 16 bits) after raising the gain, does it really matter that much? The 16-bit result ends up at the highest level; it's just the initial 24-bit recording that is lower.
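For what it's worth, the cost of conservative levels is easy to put a number on: every 6.02 dB of unused headroom costs one bit of resolution. A quick back-of-the-envelope sketch (plain Python; the function name is mine):

```python
import math

def bits_lost(peak_fraction):
    """Resolution given up by recording below full scale, in bits.
    Each halving of level (about -6.02 dB) costs one bit."""
    db_down = 20 * math.log10(peak_fraction)
    return -db_down / 6.02

# Peaks at 80% of full scale instead of 100%:
print(round(bits_lost(0.80), 2))  # ~0.32 bits
# Even a full 10 dB of headroom costs under 2 bits, so a 24-bit
# capture still carries far more resolution than the 16-bit target.
print(round(bits_lost(10 ** (-10 / 20)), 2))  # ~1.66 bits
```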

Thanks in advance for reading this and any advice you might have.

gritingrooves
Shadorne...It's not just about dynamic range. The size of the digital steps (quantization) is the LSB (Least Significant Bit). Even if a program is loud all the time (little dynamic range) the accuracy with which the digital data represents the analog waveform depends on the quantization.
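To put rough numbers on those step sizes (a quick sketch; the function name is just for illustration, and full scale is taken as ±1.0):

```python
def lsb_step(bits):
    """One quantization step (LSB) for a full-scale range of +/-1.0."""
    return 2.0 / (2 ** bits)

for bits in (16, 24):
    # 16-bit: 65,536 steps; 24-bit: 16,777,216 steps (256x finer)
    print(f"{bits}-bit: {2 ** bits:,} steps, LSB = {lsb_step(bits):.2e}")
```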
Ok, folks, while some of you feel I don't need to worry, given 24-bits, with maximizing the gain, the rest of you maintain that it still matters.

I'm sure it does from a purist's point of view. If I were recording one or two records, I'd do it as many times as necessary to get close to max without clipping, but I have a ton to do, so I can't afford to be too picky. ;-)

That was the point of this query. I want to verify that if I'm down a few dB, 24-bit will make up for it a little.

I know that when, after cleaning clicks, etc., I raise the gain (using Bias Peak's anti-clipping tool), I'll just be padding with zeros.
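A toy illustration of the zero-padding point, assuming an exact power-of-two gain (real gain changes aren't clean bit shifts, but the idea is the same: boosting can't add detail that wasn't captured):

```python
# A quiet 24-bit sample, raised 6.02 dB (doubled): the integer value
# shifts left and the vacated low bit is zero -- no new detail appears.
sample = 0b000000000101101100110101   # some low-level 24-bit value
boosted = sample << 1                 # +6.02 dB gain = multiply by 2
print(f"{sample:024b}")
print(f"{boosted:024b}")              # ends in 0: "padded with zeros"
```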

What is your opinion about what happens when I dither down to 16?

In theory this will increase the noise floor, but I expect your data will already have such a high noise floor due to the analog source (as I explained above, there is no way it exploits 24 bits of dynamic range, or 144 dB). So in practice the difference between your 24-bit and 16-bit data should be totally inaudible. 16-bit/44.1 kHz was well chosen 30 years ago; it is perhaps one of the reasons SACD has struggled.
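If anyone wants to see the mechanics, here's a rough sketch of TPDF dither on the way down to 16 bits. This is the generic textbook method, not necessarily what Peak does internally, and the names are mine:

```python
import random

def dither_24_to_16(sample):
    """Reduce a 24-bit integer sample to 16 bits with TPDF dither:
    triangular noise spanning +/-1 LSB (at 16 bits) is added before
    rounding, trading quantization distortion for a benign noise floor."""
    lsb = 1 << 8  # one 16-bit LSB, expressed in 24-bit counts
    # Sum of two uniforms gives a triangular (TPDF) distribution.
    noise = (random.random() - 0.5 + random.random() - 0.5) * lsb
    return int(round((sample + noise) / lsb))

random.seed(0)
print(dither_24_to_16(1_000_000))  # close to 1_000_000 / 256 = 3906.25
```

Individual samples land one step above or below, but on average the dithered output is unbiased, which is why low-level detail survives the bit reduction as noise rather than distortion.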