Hi Lew,
I doubt that it's possible to generalize in a meaningful way, as every design is different and has its own set of tradeoffs.
One thing that would seem safe to say, though, is that the signal-to-noise ratio of the signals that are ultimately presented to the speakers, and hence the amount of background hiss that is heard, can't be any better than it is at the front end of the signal path, which is to say the ratio of the cartridge's output voltage to the noise that is present in the circuitry at the front end of the phono stage (aside from common mode noise that may be rejected if the phono stage is balanced). Noise that is present at that point will, along with the signal, be amplified by every amplification stage that follows, and the signal level at that point is lower than at any subsequent point in the chain.
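To put rough numbers on that, here is a quick Python sketch. The specific voltage and noise figures are made up purely for illustration; the point is just that each stage amplifies the noise arriving at its input along with the signal, so the S/N at the output can never be better than the S/N at the front end:

```python
import math

def db(ratio):
    """Convert a voltage ratio to decibels."""
    return 20 * math.log10(ratio)

# Hypothetical numbers, purely for illustration:
cart_out = 0.5e-3         # 0.5 mV cartridge output
front_end_noise = 0.5e-6  # 0.5 uV of noise at the phono stage front end

# S/N at the very front of the chain:
snr_front = db(cart_out / front_end_noise)  # 60 dB

# A later stage multiplies the signal AND the noise already present by its
# gain, then adds its own noise on top of that.
gain2 = 10.0              # 20 dB of gain in a second stage
stage2_noise = 2e-6       # 2 uV of noise added by that second stage

signal_out = cart_out * gain2
# Uncorrelated noise sources add in RMS fashion:
noise_out = math.sqrt((front_end_noise * gain2) ** 2 + stage2_noise ** 2)
snr_out = db(signal_out / noise_out)

print(f"S/N at front end:  {snr_front:.1f} dB")
print(f"S/N after stage 2: {snr_out:.1f} dB (always <= front-end S/N)")
```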
A key factor with respect to your question, which I don't have a specific feel for, is how much the S/N performance of an adjustable gain phono stage will tend to vary as its gain is adjusted. My guess is that in general, if the gain setting is increased by X dB while remaining within reasonable bounds relative to the cartridge output, the S/N performance of the phono stage will degrade by considerably less than X dB, and perhaps not at all in some cases. That would be consistent with your observations concerning background noise: if the gain setting can be increased without significant S/N degradation at that point in the signal path, the noise generated by downstream circuit stages, between that point and the volume control, becomes less significant relative to the increased signal level at those points, which might result in a net improvement in S/N. It would also lessen the impact of noise that may be picked up at the interface between the phono stage and the preamp as a result of ground loop or RFI/EMI effects.
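Here is a similar sketch of the scenario I'm describing. The numbers are again entirely hypothetical: I've assumed the phono stage's input-referred noise rises by only about 3 dB when its gain is raised by 10 dB, and that a fixed amount of downstream noise dominates at the lower gain setting. Under those assumptions the net S/N actually improves:

```python
import math

def db(ratio):
    return 20 * math.log10(ratio)

# Hypothetical, for illustration only:
cart_out = 0.5e-3           # 0.5 mV cartridge output
downstream_noise = 100e-6   # fixed downstream noise (line stage, pickup at
                            # the interconnect, etc.), referred to the
                            # point just ahead of the volume control

# Two gain settings: +10 dB of gain, with the phono stage's input-referred
# noise assumed to rise by only ~3 dB (0.5 uV -> 0.7 uV):
for gain_db, input_noise in [(40, 0.5e-6), (50, 0.7e-6)]:
    gain = 10 ** (gain_db / 20)
    signal = cart_out * gain
    # The phono stage's noise is amplified along with the signal;
    # the downstream noise is not.
    noise = math.sqrt((input_noise * gain) ** 2 + downstream_noise ** 2)
    print(f"{gain_db} dB gain: net S/N = {db(signal / noise):.1f} dB")
```

Under those assumptions the sketch shows a net improvement of roughly 3 dB at the higher gain setting, even though the phono stage's own S/N degraded slightly, because the fixed downstream noise became less significant relative to the larger signal.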
But of course it might be a completely different story if what is being compared are DIFFERENT phono stages, whose gains differ by X dB but whose S/N performances are not similar.
As far as dynamics are concerned, a number of additional unpredictable variables may come into play. One of those is the distortion performance of the various circuit stages in the chain, and how that performance is affected by signal level. You may have seen Ralph (Atmasphere) comment in the past, in a different context (that of SET amplifiers), that since the 5th, 7th, and 9th harmonics of a note's fundamental frequency are significant determinants of our perception of loudness, an increase in those distortion components that occurs primarily on high-volume transients will result in a subjective perception of increased dynamics. Since line-level and phono-level stages almost always operate in Class A, there is no crossover distortion that would assume greater significance as signal level decreases, so it seems possible that the effect he described could occur in those stages as a result of the increase in non-linearity that may occur at high signal levels. So in some cases an increase in perceived dynamics might be the result of low-level odd-harmonic distortion produced by the circuit stages preceding the volume control, when those stages are asked to handle higher-level signals as a result of a gain increase further upstream.
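For what it's worth, a simple waveshaping simulation illustrates the effect Ralph described. The tanh transfer curve below is just a stand-in for a generic soft non-linearity (no claim that any particular circuit behaves this way); it shows the odd harmonics rising much faster than the fundamental as the drive level increases:

```python
import numpy as np

# One second of a 1 kHz sine at 48 kHz, so FFT bins fall exactly on 1 Hz
# multiples and each harmonic lands on its own bin.
fs, f0, n = 48000, 1000, 48000
t = np.arange(n) / fs
win = np.hanning(n)

for drive in (0.1, 0.8):  # low-level vs high-level input
    # tanh as a stand-in for a gently compressive gain stage:
    y = np.tanh(drive * np.sin(2 * np.pi * f0 * t))
    spec = np.abs(np.fft.rfft(y * win))
    fund = spec[f0]  # bin of the 1 kHz fundamental
    for h in (3, 5, 7, 9):
        level = 20 * np.log10(spec[h * f0] / fund + 1e-12)
        print(f"drive={drive}: H{h} = {level:6.1f} dB relative to fundamental")
    print()
```

At the low drive level the odd harmonics are far down; at the high drive level they are dramatically more prominent, which is the kind of level-dependent distortion increase that could create the subjective impression of greater dynamics.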
Perhaps Jonathan or Ralph will comment further on your question, as I'm sure they could speak to it more knowledgeably than I can.
Best regards,
-- Al