RCA to XLR adapters?


I currently have a complete single-ended (RCA) input system but am looking at other amps that are fully balanced and accept only balanced (XLR) connections. My question is: will these adapters give you the full benefit of a balanced amp or preamp? Or will they simply work OK? I have expensive cable that I will not replace, but it is terminated with RCAs.
bobheinatz
If the amplifier is truly balanced, it will not care if the input signal is balanced or single ended, nor will it matter if you use the RCA or XLR for that input.

I can imagine an amplifier that is balanced but does not take advantage of all the possibilities that balanced operation offers, but the idea that using a single-ended source with it might set off the protection circuit seems weird. If that is really true then there is a problem in the amplifier that needs to be fixed; I can't think of a reason why a properly operating amp would behave that way.

Now when you attempt to run a balanced amp with a single-ended source, care must be taken that the hookup is correct. Often it's a good idea to make sure pin 3 and pin 1 (ground) are tied together. If this is not done, pin 3 can float and inject noise into the amplifier. This might be why we see a few weird responses in the posts above (although I would have expected that this sort of nuance would have been dealt with, but maybe not...). The Cardas adaptors we have seen do not do anything with pin 3, so they usually need some sort of attention, depending on the setup.
I assume that is the architecture used in Ralph's designs. So the question becomes whether or not the input stage of all or at least most other "fully balanced" amps can be expected to have been designed in an equivalent manner. I don't know the answer to that question.
Al, you're exactly correct -- the design of what constitutes a "balanced" input stage, and a "balanced amplifier" circuit, varies wildly among different manufacturers and circuit designers.

There *should* really be two different, independent aspects to "balanced" equipment design -- the first being the rejection of noise as a result of equipment interconnection, the second being any performance benefits/demerits of differential circuit operation. From what I've seen in high-end audio, designers and engineers get these two goals confused the overwhelming majority of the time.

I know you're familiar with Whitlock's papers on the subject of balanced interconnection - these give excellent insight into the issues of balanced input-stage design and how common-mode rejection is dependent on the balance of impedance (NOT voltage!) presented by the line driver and cable/connectors. Basically, the input stage's tolerance to source imbalance is a function of the ratio of the common-mode and differential-mode impedances. That is, for a given differential-mode (signal) input impedance, the higher the common-mode impedance . . . the greater tolerance the balanced input has for source impedance balance errors. This is how a balanced input circuit with a high common-mode impedance (i.e. a transformer) can indeed give substantial rejection of ground noise from an unbalanced source, provided it's wired with a properly-terminated balanced adapter cable.
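To put a rough number on that impedance-ratio argument, here is a minimal sketch of the common rule of thumb from Whitlock's work: rejection of ground noise is set by the ratio of the input stage's common-mode impedance to the source's impedance *imbalance* (the impedance values below are illustrative assumptions, not measurements of any particular product):

```python
import math

def cmrr_db(z_common_mode, delta_z_source):
    """Approximate common-mode rejection (dB) of a balanced input,
    per the rule of thumb that rejection is set by the ratio of the
    input's common-mode impedance to the source's impedance
    IMBALANCE.  Valid only when z_common_mode >> delta_z_source."""
    return 20 * math.log10(z_common_mode / delta_z_source)

# A typical active balanced input (~10 kohm common-mode) vs. a
# transformer input (tens of megohms), both fed from a source with
# a 10-ohm impedance imbalance between its two legs:
print(round(cmrr_db(10e3, 10)))   # active input: ~60 dB
print(round(cmrr_db(50e6, 10)))   # transformer input: ~134 dB
```

This is why the transformer-coupled input is so much more forgiving of real-world source imbalances, including the gross imbalance of an unbalanced source wired through a properly terminated adapter.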

Where the necessity for balanced voltages comes into play is for the rejection of even-order distortion products -- this occurs by enforcing symmetry in the circuit's transfer function . . . and the majority of designers confuse this goal with that of interconnection noise rejection.
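The even-order cancellation can be shown with a toy polynomial model of each amplifier half (the coefficients here are arbitrary, chosen only to make the arithmetic visible): drive the halves with +x and -x, subtract, and the even-order (x²) term nulls while the odd-order terms remain. Any mismatch between the halves leaves an even-order residue.

```python
def half(x, g=10.0, a2=0.05, a3=0.01):
    """One amplifier half with mild 2nd- and 3rd-order nonlinearity."""
    return g * x + a2 * x**2 + a3 * x**3

def diff_out(x, a2_hot=0.05, a2_cold=0.05):
    """Differential output: the two halves see +x and -x."""
    return half(x, a2=a2_hot) - half(-x, a2=a2_cold)

x = 1.0
# Matched halves: output is 2*g*x + 2*a3*x^3 -- the x^2 term cancels.
print(diff_out(x))                  # 20.02
# A small mismatch in the 2nd-order coefficient breaks the null:
print(diff_out(x, a2_cold=0.04))    # 20.03 (residual even-order term)
```

Note that the cancellation depends entirely on the symmetry of the two transfer functions, not on anything about the interconnection, which is exactly the confusion being described.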

A common example is this notion of a "balanced amplifier" being simply two amps stuck in a chassis wired to different pins of an XLR connector. In this instance (as you correctly postulated) any voltage imbalance between the two amplifier stages will disturb the necessary null for cancellation of even-order harmonic distortion products. And this voltage imbalance will be affected by any impedance OR voltage imbalances in the preceding equipment, cables, or the input termination resistances . . . not to mention any differences in gain, distortion performance, or bandwidth between the two amplifier stages. It's even common to see this approach with two otherwise conventional solid-state power-amp circuits, which makes very little sense at all . . . because the differential input stage alone (with high open-loop gain) is very effective at eliminating even-order distortion products, and the load impedance is then effectively halved across the amplifier output, making the output stage much less linear (everything else being equal).

I think it's exceptionally bad form to design a preamp or amplifier in this manner, but it's alarmingly ubiquitous . . . I guess because this idea of "balanced" is so in vogue right now. There's also a common variation on this where the two amplifier stages share a common differential feedback ladder (like an "instrumentation" op-amp), making the common-mode gain unity, and (hopefully) substantially lower than the signal gain. This latter topology can be somewhat successful, but IMO it's still a bad choice to rely solely on this for input-stage noise rejection . . . the noise is still present to a degree through the entire circuit, and can intermodulate with the signal to a degree dependent on the circuit's linearity.

I believe Ralph's amps are based on differential amplifiers for each stage, and this is one (rare) instance where a single approach can work reasonably well for both interconnection noise rejection and even-order distortion cancellation. Here, the amplifier's tolerance of source impedance imbalances is mainly a matter of the choice and tolerance of the input termination resistors, and its tolerance of source voltage imbalances is a function of the transconductance of each stage and the matching of certain circuit elements (i.e. input triodes and plate-load resistors).

So the short answer to the original poster's question is . . . how well the amplifier works with a simple adapter is dependent on how well the amplifier is designed . . . and as always, simply having a brand name with "audiophile-cred" doesn't mean it's well-designed.
Kirk, thanks very much for another of your exceptionally knowledgeable big-picture perspectives, which certainly makes a persuasive case for the conclusion expressed in your last paragraph.
The Cardas adaptors we have seen do not do anything with pin 3
It amazes me how so many manufacturers, both pro-oriented and audiophile-oriented, seem to have it backwards when it comes to RCA-to-XLR adapters. RCA to XLR-female adapters, which would be used on XLR outputs, should leave pin 3 unconnected, but almost invariably ground it to pin 1. RCA to XLR-male adapters, which would be used on XLR inputs, should ground pin 3 to pin 1, yet here we have an example of a respected high-end manufacturer leaving pin 3 open.
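The rule in the preceding paragraph is easy to invert in practice, so here is a minimal sketch (a hypothetical helper, not any vendor's spec) encoding it: the adapter's XLR gender tells you which end of the link it sits on, and that determines the pin-3 treatment.

```python
def pin3_treatment(xlr_gender):
    """Pin-3 wiring rule for RCA-to-XLR adapters, keyed on the
    adapter's XLR gender.  Female mates with an XLR output; male
    mates with an XLR input."""
    rules = {
        "female": "leave pin 3 open (adapter sits on an XLR output; "
                  "grounding pin 3 would short the output's inverted leg)",
        "male": "tie pin 3 to pin 1 (adapter sits on an XLR input; "
                "a floating pin 3 can inject noise)",
    }
    return rules[xlr_gender]

print(pin3_treatment("male"))
print(pin3_treatment("female"))
```

The parenthetical rationales are the ones given in this thread: a grounded pin 3 on an output can short an actively driven inverted leg, while a floating pin 3 on an input picks up noise.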

Best regards,
-- Al
I think most of the responders here have missed the question. Bobheinatz wants to use a single-ended (2-wire) cable in a balanced setup, which will require a 3-wire cable. You can use adapters or reterminate, but you won't get a balanced signal with a 2-wire cable.
12-16-10: Tmsorosk: I think most of the responders here have missed the question. Bobheinatz wants to use a single-ended (2-wire) cable in a balanced setup, which will require a 3-wire cable. You can use adapters or reterminate, but you won't get a balanced signal with a 2-wire cable.
The discussion has focused mainly on the EFFECTS of putting an unbalanced signal into a balanced amp, so I don't think the question has been missed.
12-15-10: Kirkus: A common example is this notion of a "balanced amplifier" being simply two amps stuck in a chassis wired to different pins of an XLR connector. In this instance (as you correctly postulated) any voltage imbalance between the two amplifier stages will disturb the necessary null for cancellation of even-order harmonic distortion products. And this voltage imbalance will be affected by any impedance OR voltage imbalances in the preceeding equipment, cables, or the input termination resistances . . . not to mention any differences in gain, distortion performance, or bandwidth between the two amplifier stages.
I think it's also worth noting that using an adapter to feed an unbalanced signal into that kind of design would leave up to 75% of the amplifier's power capability unusable, since the voltage swing between the two output terminals would be cut in half.
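The 75% figure follows directly from power scaling as the square of voltage into a fixed load; a one-line sketch:

```python
def available_power_fraction(swing_fraction):
    """Into a fixed load, power goes as the square of voltage swing:
    P = V^2 / R, so halving the swing leaves one quarter of the power."""
    return swing_fraction ** 2

# Driving only one half of a bridged "balanced" amp halves the
# differential output swing:
print(available_power_fraction(0.5))      # 0.25 of rated power available
print(1 - available_power_fraction(0.5))  # 0.75 of capability lost
```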

Best regards,
-- Al