Hi Axel,
It's not that I don't like the differential approach - in fact my linestage is designed this way. But I will confess that I'm especially proud of the bypassing scheme - there are separate, unambiguous AC return paths both for differential signal current and for signal current that flows to ground (i.e. from an impedance imbalance in the output cable or the amplifier that follows it).
> Some would not agree with this necessarily, since other than in digital designs, harmonic distortion is never buried in the noise floor completely.

Sure it is - just about any datasheet-derived single-NE5534 phono stage will do it, at reasonable signal levels and output currents. I obviously don't feel that such circuits are the ultimate expression of what's possible in a phono preamp, but the only reason to tolerate measurable harmonic distortion in these circuits is if the designer feels there are benefits to choosing certain types of parts or topologies, and that those choices make it impossible to eliminate the distortion. I have no problem with that . . . every designer is free to decide which parameters meet their goals - ultra-low distortion just happens to be one of mine.
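To make the "buried in the noise floor" claim concrete, here is a back-of-envelope sketch. The 3.5 nV/√Hz figure is the NE5534's typical input voltage noise from its datasheet; the gain, signal level, and THD figure are round-number assumptions for illustration, not measurements of any particular circuit:

```python
import math

# Assumed, illustrative numbers - not a spec for any specific design.
en = 3.5e-9        # NE5534 typical input voltage noise, V/sqrt(Hz)
bw = 20_000.0      # audio bandwidth, Hz (flat approximation, no RIAA weighting)
gain_db = 40.0     # assumed 1 kHz gain of an MM phono stage
v_out = 0.5        # assumed output level, V RMS

# Total voltage noise at the input, then referred to the output.
input_noise = en * math.sqrt(bw)
output_noise = input_noise * 10 ** (gain_db / 20)

# Suppose the stage achieves 0.0001% THD at this level (plausible for a
# lightly loaded op-amp inside heavy feedback - again, an assumption).
thd = 1e-6
harmonic = v_out * thd   # RMS sum of harmonic products

print(f"output noise : {output_noise * 1e6:.1f} uV RMS")
print(f"harmonics    : {harmonic * 1e6:.2f} uV RMS")
print("distortion below noise floor:", harmonic < output_noise)
```

With these numbers the harmonic products (0.5 µV) sit roughly 40 dB below the wideband output noise (about 50 µV), which is what "buried in the noise floor" means in practice.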
> It seems to be the current understanding that harmonic distortion should RISE evenly (even- and odd-order equally) with increasing output, rather than decrease at higher output. (Output rises as the cartridge input rises.)

I believe that what you're referring to is a specific type of fault that many feel can occur as a result of crossover distortion in Class B power amplifiers. Again, a phono preamplifier can and should be completely free of these types of anomalies.
> It is VERY difficult, if not impossible, to prevent some potential differences from occurring in components (and between them) during all states of operation. At best they can be directed by use of e.g. star-ground schemes. The point is, there are still caps (to ground) involved, and caps have power factors, creating a far less 'clean' signal path than the dedicated (-) in a balanced design.

If you re-consider Kirchhoff's laws and look at the signal CURRENT, it must always flow between the power-supply rails, period. The purpose of local bypass capacitors is twofold - first, to remove the effects of power-supply wiring and traces from the circuit, and second, to prevent different stages' current draws from affecting each other. These functions are necessary in both differential and non-differential circuits, and regardless of whether or not the signal VOLTAGE is defined in relation to ground, at least a portion of the signal CURRENT will always flow through the bypass capacitors . . . and that's the way it's supposed to be. It is up to the designer to keep these signal and supply currents separate from each other, whether or not they flow through a node we call "ground". In fact, this can sometimes be more of a problem in differential circuits, where each side of the circuit has separate bypass capacitors to ground . . . in which case the signal current has to flow through a minimum of two capacitors and a ground trace to return to the supply.
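The two-capacitors-plus-trace point can be put in numbers. A minimal sketch with assumed round values (100 nF bypass caps, a few nanohenries of ground trace), treating the parts as ideal:

```python
import math

def cap_z(f, c):
    """Impedance magnitude of an ideal capacitor at frequency f."""
    return 1.0 / (2 * math.pi * f * c)

def trace_z(f, l):
    """Impedance magnitude of an ideal trace inductance at frequency f."""
    return 2 * math.pi * f * l

f = 20_000.0     # top of the audio band, Hz
c = 100e-9       # assumed bypass capacitor value, farads
l_trace = 10e-9  # assumed ground-trace inductance between the two caps, henries

# Single-ended: signal current returns rail-to-rail through ONE bypass cap.
single = cap_z(f, c)

# Differential with per-side caps to ground: the return path is TWO caps
# in series plus the ground trace joining them.
double = 2 * cap_z(f, c) + trace_z(f, l_trace)

print(f"one bypass cap   : {single:6.1f} ohm")
print(f"two caps + trace : {double:6.1f} ohm")
```

The series impedance of the differential return path is more than double that of the single-cap path - a worked illustration of why per-side bypassing to ground can make the return path worse, not better, unless the designer routes it deliberately.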
> The trade-off is almost always: balanced = more dynamic and 'cleaner', vs. unbalanced = better harmonic completeness, more natural-sounding.

I firmly believe that as we improve our art . . . it IS possible to have all of what you describe, without tradeoffs. Not everybody will see it the same way, but hopefully we will all end up with a more fulfilling experience from our recorded music.