The idea of driving a cartridge directly into the virtual ground of an amp, either using just the amp input impedance (such as a grounded-base transistor) or via a resistor, is hardly a new one. Some of the earliest solid-state phono stages did exactly that, including one that I sold in the UK in the 1970s. I also used a transimpedance op amp that I designed (the AD846) in that mode, using the device as a current conveyor and operating it both closed and open loop, since the extraordinarily high impedance "compensation node" could be loaded by a resistor to provide a fixed, and low, transimpedance for the stage.
I can't say that either approach seemed to be particularly successful.
A good way to look at this is to simulate the current response of a cartridge model loaded in exactly this way. That basically means reducing the load R to whatever the amp input impedance is and measuring the current through that R, the assumption being that the current through the load R is what enters the ideal current conveyor.
It should be immediately obvious that the signal current is just the voltage across the resistor divided by the value of the resistor, so it's just a scaled version of whatever voltage the original voltage amp saw, and there is no difference in the shape of the response!
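To make that concrete, here is a minimal simulation sketch, assuming the cartridge model used in this discussion (5 ohm winding resistance, 0.5mH inductance, 100pF across the load) with the generator normalized to 1V; the variable names are mine:

```python
import numpy as np

RINT, L, C = 5.0, 0.5e-3, 100e-12   # assumed model: 5 ohm winding, 0.5 mH, 100 pF across the load

def load_current_db(f, rload):
    """Current through rload, in dB relative to the low-frequency value E/(RINT + rload)."""
    w = 2 * np.pi * f
    zload = 1.0 / (1.0 / rload + 1j * w * C)       # rload in parallel with C
    v_load = zload / (RINT + 1j * w * L + zload)   # voltage divider from a 1 V generator
    i_load = v_load / rload                        # same frequency shape as the voltage response
    i_ref = 1.0 / (RINT + rload)                   # low-frequency current
    return 20 * np.log10(np.abs(i_load) / i_ref)

f = np.logspace(1, 7, 601)                         # 10 Hz to 10 MHz
for rload in (47e3, 10.0):                         # the 47k "voltage" load vs. a 10 ohm input
    db = load_current_db(f, rload)
    at20k = db[np.abs(f - 20e3).argmin()]
    print(f"Rload = {rload:g} ohm: {at20k:+.1f} dB at 20 kHz, "
          f"max {db.max():+.1f} dB at {f[db.argmax()] / 1e3:.3g} kHz")
```

Dividing the load current by the load resistance is the whole point: the printed responses are just the familiar voltage responses rescaled.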
So all we need do in the original design is reduce the cartridge load R below the 100 ohms and see what happens.
Let's return to our initial case, the one with 100pF, not the one that is "optimized" with a much larger cap. Clearly, as the resistor falls in value the effect of the cap is reduced, so it seems like a good place to start.
Remember that at a 47k load the response is extremely flat in the audio band but has a screaming peak at c. 700kHz.
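That peak is just the winding inductance resonating with the 100pF. A quick sanity check on the number, assuming the 0.5mH figure:

```python
from math import pi, sqrt

L, C = 0.5e-3, 100e-12              # winding inductance and the 100 pF load capacitance
f_res = 1 / (2 * pi * sqrt(L * C))  # L-C resonance of the cartridge against the load cap
print(f"{f_res / 1e3:.0f} kHz")     # ~712 kHz, i.e. the peak at c. 700kHz
```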
To get decent noise performance a bipolar input stage needs to run at a bias current of at least 1mA. Let's also assume that the input is complementary, NPN and PNP transistors with the emitters connected, both in a common-base configuration; then to first order the input resistance is about 10 ohms.
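That figure comes from each emitter looking like re = VT/IC, about 26 ohms at 1mA, with the NPN and PNP emitters in parallel. A rough check, assuming room temperature:

```python
VT = 0.026                       # thermal voltage at room temperature, ~26 mV
IC = 1e-3                        # assumed bias current per transistor
re = VT / IC                     # incremental emitter resistance of one device, ~26 ohm
r_in = re / 2                    # NPN and PNP emitters in parallel
print(f"r_in = {r_in:.0f} ohm")  # ~13 ohm, i.e. "about 10 ohms" to first order
```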
Under these circumstances the frequency response of the input current (or voltage) for our 5 ohm, 0.5mH cartridge is down c. 13dB at 20kHz! That doesn't seem so sensible to me.
The reason for this should be obvious. The generator output impedance is dominated at HF by the winding inductance, so it rises roughly linearly with frequency above the point where the reactance of Lint equals Rint, which in our case is f = Rint/(2*pi*Lint) = 5/(2*pi*0.0005) = c. 1.6kHz. With the 10 ohm input resistance added in series, the corner of the current response sits at (5+10)/(2*pi*0.0005) = c. 4.8kHz, hence the c. 13dB loss at 20kHz.
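Putting numbers on that with the values above:

```python
from math import pi, log10, hypot

RTOT, L = 5.0 + 10.0, 0.5e-3    # winding R plus the ~10 ohm input resistance; winding L
f_corner = RTOT / (2 * pi * L)  # frequency where the inductive reactance equals RTOT
w = 2 * pi * 20e3               # 20 kHz in rad/s
loss_20k = 20 * log10(RTOT / hypot(RTOT, w * L))
print(f"corner = {f_corner:.0f} Hz, {loss_20k:+.1f} dB at 20 kHz")  # ~4.8 kHz, ~-12.7 dB
```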
As far as the cartridge is concerned, it can't tell whether the load it sees is a common-base input with zero DC offset or the same load into ground. By the way, the DC offset really does need to be zero: running DC current into a cartridge is just not a good idea...
You could reduce the input bias current, or add an extra R, to bring the load resistance back up to 100 ohms, but then how is it different from the case with the voltage amp?
Yes, common-base stages are different insofar as the stage is "broadbanded" compared to a common-emitter transistor stage, and the collector-base capacitance is not multiplied by the collector-base voltage gain (the Miller effect). But I don't really see why that is a big deal; if you really care, then just cascode the input stage and reduce the Miller capacitance that way, something that is often done anyway as it improves the bandwidth and linearity of the input stage.
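For a sense of scale, an illustrative calculation with assumed (not measured) numbers, a 2pF collector-base capacitance and a gain of -100 for the plain common-emitter case:

```python
C_CB = 2e-12                       # assumed collector-base capacitance of the input device
AV_CE = -100                       # assumed voltage gain of a plain common-emitter stage
c_in_plain = C_CB * (1 - AV_CE)    # Miller multiplication: ~202 pF at the input
AV_CASC = -1                       # cascoded: the collector sees ~re of the cascode device
c_in_casc = C_CB * (1 - AV_CASC)   # ~4 pF; the multiplication is gone
print(f"plain CE: {c_in_plain * 1e12:.0f} pF, cascoded: {c_in_casc * 1e12:.0f} pF")
```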