Since I've actually been down this road before with design calculations and a prototype . . . here are some of the main issues you're likely to face:
The main issue is that of headroom -- this is very critical when feeding an ADC because of its hard-limit clipping . . . mid-band modulation peaks of +15dB (relative to 5 cm/sec velocity) are common in commercially produced records. Now, a well-designed analog RIAA preamp can be set up to have a similar amount of headroom at 20KC as at 1KC, but this is of course impossible if the EQ is done completely after conversion. Further, with an MM/MI cartridge, the load/cable capacitance resonates with the coil inductance and causes an ultrasonic peak, which compensates for the natural HF rolloff of the cartridge. This peak can easily be 5-10dB at 25-30KHz . . . and overloading an ADC at the very top of the audioband brings out the very worst aspects of its performance, with big-time aliasing and intermodulation components being common.
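Just to put a number on where that resonance lands -- a quick back-of-the-envelope with assumed typical MM values (330mH coil inductance, 100pF total load/cable capacitance -- not measurements of any particular setup) puts it right in that range:

```python
# Electrical resonance of cartridge inductance against load/cable capacitance.
# L and C below are assumed typical MM figures, not from a specific cartridge.
import math

L = 0.33      # cartridge coil inductance, henries (assumed ~330 mH)
C = 100e-12   # total load + cable capacitance, farads (assumed ~100 pF)

f_res = 1.0 / (2 * math.pi * math.sqrt(L * C))
print(round(f_res))  # resonance lands in the upper 20s of kHz
```

More cable capacitance pulls the peak down toward the top of the audioband, which is exactly where you least want extra level hitting the converter.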
So if you add together 20dB for the EQ, 10dB headroom for HF peaking, and 15dB for common modulation peaks, this means that even if you set 0dB/1KC at -45dBFS into the ADC, you still have very little real-world headroom. So bring it down another 5dB for good measure, and most of the mid-band modulation is at -50dBFS, which leaves only 50dB or so S/N on a really good 24-bit ADC . . . and that's assuming the mic preamplifier you're using is perfect and noise-free.
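To make that budget explicit (all figures are the rough allowances from above, not measurements):

```python
# Headroom budget for a flat-in (no analog RIAA) ADC capture.
# All numbers are the rough allowances discussed above.
eq_boost_db  = 20   # RIAA treble boost still to be removed in DSP
hf_peak_db   = 10   # ultrasonic cartridge/cable resonance allowance
mod_peaks_db = 15   # mid-band modulation peaks over the 5 cm/s reference
margin_db    = 5    # a little extra for good measure

ref_1khz_dbfs = -(eq_boost_db + hf_peak_db + mod_peaks_db + margin_db)
print(ref_1khz_dbfs)  # -50 dBFS for the 0dB/1KC reference level

# With a very good 24-bit ADC (~100 dB usable dynamic range, assumed),
# program sitting around the reference level leaves roughly:
adc_dr_db = 100
print(adc_dr_db + ref_1khz_dbfs)  # ~50 dB S/N, before any preamp noise
```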
Now if the mic preamp is designed for a low-impedance balanced microphone, then its input En/In characteristics are going to be a marginal match (at best) for an inductive MM/MI cartridge, even if you've adjusted its loading. And since you've gone through the trouble of adapting a mic preamp (assuming you've removed loading resistors, phantom-power blocking caps, input-pad resistors and switches, etc. and added an appropriate loading network for the cartridge) . . . wouldn't it simply be easier to adjust the input impedance and capacitance of your existing phono preamp to whatever you want?
In the end, in my own pursuits I found that I could get much better performance with a well-designed two-stage RIAA preamp feeding a typical ADC eval board. To pursue digital RIAA compensation further, I concluded that it would be best to use a preamp with a fixed single-pole rolloff (6dB/octave starting at maybe 15Hz) across the entire audioband, and to apply only the precision compensation in the digital domain - this preserves both good headroom and noise characteristics, and makes the analog EQ completely non-critical, since its tolerances then only slightly affect level, not frequency response. Also interesting to me was the idea of loading an MM/MI cartridge with an I/V converter into an ADC, thus eliminating the effects of any amount of cable capacitance. The equivalent of an appropriate load capacitor could then be applied with DSP equalisation after conversion.
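For what it's worth, the residual digital EQ for that scheme works out neatly: divide the full RIAA playback curve (3180/318/75 microsecond time constants) by the analog single pole, and you're left with zeros at 15Hz and ~500.5Hz and poles at ~50.05Hz and ~2122Hz. Here's a sketch using scipy's bilinear transform -- the 96KHz sample rate and the 15Hz analog corner are my assumptions, not anything canonical:

```python
# Residual digital RIAA EQ after a single-pole (15 Hz assumed) analog stage.
import numpy as np
from scipy import signal

fs = 96000.0  # sample rate (assumed)

# Full RIAA playback: poles at 3180us and 75us, zero at 318us.
# The analog stage already supplies a single pole at ~15 Hz, so the
# digital residual is the RIAA curve divided by that pole:
#   zeros: 15 Hz and 1/(2*pi*318us) ~ 500.5 Hz
#   poles: 1/(2*pi*3180us) ~ 50.05 Hz and 1/(2*pi*75us) ~ 2122 Hz
zeros_hz = np.array([15.0, 1.0 / (2 * np.pi * 318e-6)])
poles_hz = np.array([1.0 / (2 * np.pi * 3180e-6),
                     1.0 / (2 * np.pi * 75e-6)])

b_s, a_s = signal.zpk2tf(-2 * np.pi * zeros_hz, -2 * np.pi * poles_hz, 1.0)
b_z, a_z = signal.bilinear(b_s, a_s, fs)

# Normalise the digital stage to 0 dB at 1 kHz
_, h1k = signal.freqz(b_z, a_z, worN=[1000.0], fs=fs)
b_z = b_z / np.abs(h1k[0])

# y = signal.lfilter(b_z, a_z, x)  # apply to the single-pole-filtered capture
```

At 96KHz the bilinear warping error through the audioband is down in the hundredths of a dB, so the analog-pole-plus-digital-residual combination tracks the ideal RIAA curve very closely.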
Anyway, just a few thoughts . . . good luck.