"For example, my Neumann microphones are set up to drive 150 ohms. What this means is that if you don't load them at 150 ohms (if instead you have a load of 1000 ohms or higher), the output transformer will express the inter-winding capacitance rather than the turns ratio, and you will get coloration and no bass."

Atmasphere, the vast majority of Neumann mics have an output impedance of 50 ohms, and they've traditionally specified a loading of >1K. It's only some of the classics (e.g. the U67) that are switchable to lower impedance, and they still need to be loaded at 300 ohms or more.
DIY balanced interconnects
I want to build some balanced interconnects.
1. Has anyone compared Switchcraft, Vampire and Neutrik XLR plugs?
2. Any comments on Mogami Neglex 2534 vs Vampire CCC-II vs Oyaide PA-02 cables?
3. Should the ground shield on these twinax cables be connected at both ends, only at the source end, or only at the preamp end?
Thanks for your comments.
Right, Kirkus, my Neumanns are U67s. After doing some measurements with them, we found that the transformer was more linear when driving a lower impedance; even 600 ohms was too high. The 600 ohm standard arose from the characteristic impedance created by spaced telegraph cables; it was later adopted by the phone companies, and later still by recording and broadcast equipment manufacturers. These days that standard is considered obsolete, and input impedances of 1K and the like are common, so the standard is somewhat diluted. We wanted to be assured that our gear would support existing hardware it might get connected to, as there is still quite a bit of collectible tube equipment out there, like Ampex tape machines, that is on the same standard.
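To make the level side of that standard concrete, here is a minimal voltage-divider sketch; the figures are illustrative only, and it ignores the transformer behavior described above:

```python
# Minimal voltage-divider sketch (illustrative values, not anyone's
# measurements): level delivered into matched vs. bridging loads by a
# 600-ohm source.
import math

def level_db(r_src, r_load):
    """Load voltage relative to the source's open-circuit voltage, in dB."""
    return 20 * math.log10(r_load / (r_src + r_load))

for r_load in (600, 10_000, 100_000):
    print(f"600-ohm source into {r_load:>7} ohms: {level_db(600, r_load):+6.2f} dB")

# Matched 600->600 costs a fixed 6.02 dB but presents the termination the
# old standard assumes; the higher loads recover level while leaving the
# line effectively unterminated.
```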
To sum up what I have been trying to say here: if you want the cables not to be a part of the overall system sound, the old 600 ohm standard is the way to do it. The audio world in general has seen a shift from what I call the Power Paradigm to the Voltage Paradigm (see http://www.atma-sphere.com/papers/paradigm_paper2.html for more info), and the dilution of the 600 ohm standard in balanced operation is an example. Under the old power rule, the effect of the cable could be neglected. Under the newer voltage rule, as Kirkus has mentioned, the cables have an increasingly audible effect as the load impedance is increased. The question is whether an audiophile would want the cable to have an audible artifact or not. I would prefer that it not, so I use the 600 ohm standard, as that is the technology created to solve this problem decades ago. Building more expensive cables does not look like a solution to me.
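A rough lumped-element model shows the mechanism; the inductance and capacitance below are assumed typical values for roughly 10 m of twisted pair, not measurements from anyone in this thread:

```python
# Lumped-element sketch of why termination tames the cable (assumed values:
# ~10 m of twisted pair at ~0.6 uH/m and ~100 pF/m).
import math

L, C = 6e-6, 1e-9                           # total series L, shunt C
f0 = 1 / (2 * math.pi * math.sqrt(L * C))   # LC resonance, ~2.1 MHz here

def gain_db(f, r_src, r_load):
    """Gain of: source r_src -> series L -> (C in parallel with r_load)."""
    w = 2 * math.pi * f
    z_load = r_load / (1 + 1j * w * C * r_load)   # r_load parallel with C
    h = z_load / (r_src + 1j * w * L + z_load)
    return 20 * math.log10(abs(h))

for r_src, r_load, label in ((600, 600, "600-ohm source and load"),
                             (50, 100_000, "low-Z source, bridging load")):
    print(f"{label:28s}: {gain_db(1e3, r_src, r_load):+6.2f} dB at 1 kHz, "
          f"{gain_db(f0, r_src, r_load):+6.2f} dB at resonance")

# The matched line shows its flat 6 dB pad and a smooth rolloff; the
# bridged line is flat in-band but peaks about +4 dB at the (ultrasonic)
# LC resonance, so the cable's own construction starts to matter.
```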
Atmasphere, I like your paper . . . it's interesting and eloquent; a good read. And I understand how it's an alluring perspective, especially as far as speaker impedance is concerned, for the manufacturer of an amplifier with a high output impedance. It's just too bad that the historical data doesn't support it - but we're never going to convince each other otherwise, so I'll drop it. But maybe you could shed some light on why, if you're advocating a power-transfer approach (as is common in video and RF transmission systems), you're not using something like 110 - 150 ohms (source and load), the intrinsic impedance of a typical balanced interconnect? Because that's how a power-transfer system is supposed to work, no?
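The power-transfer point is easy to sanity-check numerically; a minimal sketch under a purely resistive model, with the 110 ohm source impedance assumed:

```python
# Quick check of maximum power transfer: with a purely resistive model
# (assumed 110-ohm source, 1 V open-circuit), power into the load peaks
# exactly when the load matches the source.
def load_power_mw(v_src, r_src, r_load):
    i = v_src / (r_src + r_load)          # loop current
    return 1e3 * i * i * r_load           # power dissipated in the load, mW

for r_load in (50, 110, 600, 10_000):
    print(f"R_load = {r_load:>6} ohms: {load_power_mw(1.0, 110.0, r_load):.3f} mW")

# 110 ohms wins (2.273 mW); 600 ohms delivers about half that, and a
# bridging 10k load takes almost no power at all.
```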
Kirkus, if you are referring to characteristic impedance, we in fact examined that issue about 20 years ago. The fact of the matter is that if you can determine the actual characteristic impedance of the cable and then terminate it correctly, the result is quite spectacular. The problem is that you need a time-domain reflectometer or the like to make the determination - in practice a bit impractical. So you have no standard termination value, as characteristic impedance can vary quite a bit due to minor changes in the construction and materials of the cable. This is likely why the industry settled on 600 ohms decades ago; it clearly was not an accident. In practice the 600 ohm standard works quite well, and since it was already in place, it seemed to be a matter of picking one's battles. As it is, the fact that our preamp is balanced has been its single biggest marketing problem, so in retrospect I'm glad it's not been any more complicated than it already has been :) I would have loved to have more input when we were setting a lot of this up, but at the time we were the only ones who cared about balanced operation. I can find plenty of historical evidence to support my paper, BTW; the Radiotron Designer's Handbook, published by RCA, is a good place to start. If you wish to discuss this further, that would be a topic for another thread or email. In fact I would love to do that; I've been looking for someone who can rebut the document ever since it was written (about 3 years ago). No one has been able to do that so far. BTW, it's not about having a tube amp with a high output impedance, it's about whether you use feedback or not - IOW, obeying the rules of human hearing or not.
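For readers wondering why the exact characteristic impedance matters so much here, a small sketch of the reflection coefficient; the 110 ohm Z0 below is an assumed figure, the kind of number a TDR measurement would pin down:

```python
# Sketch of the termination problem: the reflection coefficient depends on
# how far the load sits from the cable's true characteristic impedance.
def reflection(z_load, z0):
    return (z_load - z0) / (z_load + z0)

z0 = 110.0                                    # assumed cable Z0
for z_load in (110, 150, 600, 10_000):
    g = reflection(z_load, z0)
    print(f"terminated in {z_load:>6} ohms: gamma = {g:+.3f} "
          f"({100 * abs(g):.0f}% of the incident wave reflected)")

# A perfect match reflects nothing; 600 ohms on a 110-ohm line still
# reflects ~69%, which is why the exact (and variable) Z0 of a given cable
# matters if you try to terminate in its characteristic impedance.
```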