RCA to XLR adapters?


I currently have an all single-ended (RCA) system, but I have been looking at other amps that are fully balanced and accept only balanced (XLR) connections. My question is: will these adapters give you the full benefit of a balanced amp or preamp, or will they simply work OK? I have expensive cable that I will not replace, but it is terminated with RCAs.
bobheinatz
From my experience with Cardas XLR-to-RCA adapters, they greatly degrade the sound. If you have a very high-end system, I suggest using them only in an emergency or as a temporary measure.
The only time I tried running single-ended into balanced, the amp did not respond well at all, and it was fully differential. It made sound, but it sounded bad.

Many find that they don't need expensive XLR cables because the effects of the cable are reduced. You didn't mention whether your preamp has XLR outs.
One note of caution: Not all Balanced amps are created equal.

I put this very question to ARC's tech team some years back and learned -

ARC's balanced amps (according to ARC) will perform "very poorly" if you try this scheme. I'm not particularly technically oriented, but my understanding is that the lack of noise cancellation in the single ended signal will trip the protection circuitry at less than half the rated output. My 122 WPC VT30SE will output +/- 50 WPC (IIRC) before the protection police shut down the party.

Disclaimer: This posting is based on my recollection of a conversation with ARC's tech guy many years back, so I hope I got it right and that this explanation makes sense to those more knowledgeable about the issues at hand.

Marty
If the amplifier is truly balanced, it will not care if the input signal is balanced or single ended, nor will it matter if you use the RCA or XLR for that input.

I can imagine an amplifier that is balanced but does not take advantage of all the possibilities that balanced operation offers, but the idea that using a single-ended source with it might set off the protection circuit seems weird. If that is really true then there is a problem in the amplifier that needs to be fixed; I can't think of a reason why a properly operating amp would behave that way.

Now when you attempt to run a balanced amp with a single-ended source, care must be taken that the hookup is correct. Often it's a good idea to make sure pin 3 and pin 1 (ground) are tied together. If this is not done, pin 3 can float and inject noise into the amplifier. This might be why we see a few weird responses in the posts above (although I would have expected that this sort of nuance would have been dealt with, but maybe not...). The Cardas adaptors we have seen do not do anything with pin 3, so they usually need some sort of attention, depending on the setup.
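To make the floating-pin-3 problem concrete, here is a toy divider model (all component values are invented for illustration): an open pin 3 plus stray coupling to nearby wiring forms a voltage divider into the amp's inverting input, while strapping pin 3 to pin 1 at the adapter shorts that path out.

```python
# Toy model of an unterminated "cold" pin on a balanced input.
# The adapter leaves pin 3 open, so the amp's pin-3 input impedance and
# stray coupling to a noise source form a divider feeding the (-) input.
z_in = 20e3        # assumed pin-3 input impedance of the balanced stage, ohms
z_stray = 200e3    # assumed stray coupling impedance to interference, ohms
v_noise = 0.5      # volts of nearby interference (invented)

# Floating pin 3: the divider injects noise into the differential input.
v_pin3_floating = v_noise * z_in / (z_in + z_stray)

# Pin 3 strapped to pin 1 (ground) at the adapter: the divider is shorted.
v_pin3_grounded = 0.0

print(round(v_pin3_floating, 4))  # → 0.0455
```

The numbers are arbitrary; the point is only that an open pin 3 gives stray noise an uncontested path into one half of the differential input.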
I assume that is the architecture used in Ralph's designs. So the question becomes whether or not the input stage of all or at least most other "fully balanced" amps can be expected to have been designed in an equivalent manner. I don't know the answer to that question.
Al, you're exactly correct -- the design of what constitutes a "balanced" input stage, and a "balanced amplifier" circuit, varies wildly among different manufacturers and circuit designers.

There *should* really be two different, independent aspects to "balanced" equipment design -- the first being the rejection of noise as a result of equipment interconnection, the second being any performance benefits/demerits of differential circuit operation. From what I've seen in high-end audio, designers and engineers get these two goals confused the overwhelming majority of the time.

I know you're familiar with Whitlock's papers on the subject of balanced interconnection - these give excellent insight into the issues of balanced input-stage design and how common-mode rejection is dependent on the balance of impedance (NOT voltage!) presented by the line driver and cable/connectors. Basically, the input stage's tolerance to source imbalance is a function of the ratio of the common-mode and differential-mode impedances. That is, for a given differential-mode (signal) input impedance, the higher the common-mode impedance . . . the greater tolerance the balanced input has for source impedance balance errors. This is how a balanced input circuit with a high common-mode impedance (i.e. a transformer) can indeed give substantial rejection of ground noise from an unbalanced source, provided it's wired with a properly-terminated balanced adapter cable.
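A rough numeric sketch of that impedance-balance point, treating each input leg as a simple voltage divider (the impedance values are invented, and the calculation ignores everything except divider mismatch):

```python
import math

def cmrr_db(z_cm, z_s1, z_s2):
    """Common-mode rejection limited purely by source-impedance imbalance.

    Each leg is a divider: the input pin sees Vcm * Zcm / (Zcm + Zs).
    Any difference between the two dividers converts common-mode voltage
    into a differential error the input stage cannot reject.
    """
    v1 = z_cm / (z_cm + z_s1)
    v2 = z_cm / (z_cm + z_s2)
    diff = abs(v1 - v2)  # common-mode -> differential conversion
    return 20 * math.log10(1 / diff) if diff else float("inf")

# Same 10-ohm source imbalance, two very different common-mode impedances:
print(round(cmrr_db(20e3, 300, 310)))   # ordinary active input → 66 (dB)
print(round(cmrr_db(50e6, 300, 310)))   # transformer-like Zcm → 134 (dB)
```

The same small source imbalance costs a typical active input tens of dB of rejection, while a very high common-mode impedance (as with a transformer) barely notices it, which is exactly the tolerance argument above.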

Where the necessity for balanced voltages comes into play is for the rejection of even-order distortion products -- this occurs by enforcing symmetry in the circuit's transfer function . . . and the majority of designers confuse this goal with that of interconnection noise rejection.

A common example is this notion of a "balanced amplifier" being simply two amps stuck in a chassis wired to different pins of an XLR connector. In this instance (as you correctly postulated) any voltage imbalance between the two amplifier stages will disturb the necessary null for cancellation of even-order harmonic distortion products. And this voltage imbalance will be affected by any impedance OR voltage imbalances in the preceding equipment, cables, or the input termination resistances . . . not to mention any differences in gain, distortion performance, or bandwidth between the two amplifier stages. It's even common to see this approach with two otherwise conventional solid-state power-amp circuits, which makes very little sense at all . . . because the differential input stage alone (with high open-loop gain) is very effective at eliminating even-order distortion products, and the load impedance is then effectively halved across the amplifier output, making the output stage much less linear (everything else being equal).
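The even-order cancellation, and its sensitivity to mismatch, is easy to demonstrate numerically. A minimal sketch, assuming a made-up polynomial transfer function for each half and a 1% gain mismatch between them:

```python
import math

def stage(x, a2=0.1, a3=0.02):
    # Toy single-ended transfer function: gain 1 plus small (invented)
    # 2nd- and 3rd-order nonlinear terms.
    return x + a2 * x * x + a3 * x ** 3

def harmonic(samples, n, k):
    # Magnitude of the k-th harmonic via direct correlation (one DFT bin).
    re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
    im = sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
    return 2 * math.hypot(re, im) / n

N = 1024
tone = [math.sin(2 * math.pi * i / N) for i in range(N)]  # one test-tone cycle

# Two identical halves driven in antiphase, output taken differentially:
balanced = [stage(x) - stage(-x) for x in tone]
# The same thing with a 1% gain mismatch between the halves:
mismatched = [stage(x) - 0.99 * stage(-x) for x in tone]

print(harmonic(balanced, N, 2))    # 2nd harmonic nulls out (~0)
print(harmonic(mismatched, N, 2))  # mismatch lets it leak back in
```

With identical halves, the even-order (x²) term is identical in both legs and subtracts out exactly; a 1% gain error leaves a residual even-order product, illustrating why the null depends on matching of gain, impedances, and device characteristics.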

I think it's exceptionally bad form to design a preamp or amplifier in this manner, but it's alarmingly ubiquitous . . . I guess because this idea of "balanced" is so in vogue right now. There's also a common variation on this where the two amplifier stages share a common differential feedback ladder (like an "instrumentation" op-amp), making the common-mode gain unity, and (hopefully) substantially lower than the signal gain. This latter topology can be somewhat successful, but IMO it's still a bad choice to rely solely on this for input-stage noise rejection . . . the noise is still present to a degree through the entire circuit, and can intermodulate with the signal to a degree dependent on the circuit's linearity.

I believe Ralph's amps are based on differential amplifiers for each stage, and this is one (rare) instance where a single approach can work reasonably well for both interconnection noise rejection and even-order distortion cancellation. Here, the tolerance of the amplifier to source impedance imbalances is mainly a function of the choice and tolerance of the input termination resistors, and the tolerance of source voltage imbalances is a function of the transconductance of each stage and the matching of certain circuit elements (i.e. input triodes and plate-load resistors).

So the short answer to the original poster's question is . . . how well the amplifier works with a simple adapter is dependent on how well the amplifier is designed . . . and as always, simply having a brand name with "audiophile-cred" doesn't mean it's well-designed.