How to go from RCA to XLR?


I've got an Aragon Stage One processor with RCA outputs and two Aragon Palladium 1K monoblocks with XLR inputs. I know there are a lot of RCA-XLR cables available, but a fabricator told me you have to know which XLR pins are "hot" and these have to match the amps' input circuitry or you will damage the amp.
So how do you know which pins to make hot when you order the cables? When you buy these cables "off the shelf" are you just hoping you get lucky and they match up with your equipment?
Thanks
noslop
Man, you guys really get into it. It's great. I'm an M.D., so my knowledge of electronic design and theory is quite limited. I'll have to read it all again a few times to even begin to understand all that's been said. But in medicine and all science, empirical evidence trumps all. So here's what I found empirically. When I hooked up a balanced Aragon Aurum preamp to the Palladiums, I couldn't turn the volume to half before my Martin Logans were so loud it hurt. With the Stage One's gain at max (there's a gain control in the advanced settings) and the volume at max, the speakers were loud (sort of), but nowhere near as loud or impressive as when the balanced Aurum was hooked up. So I'd have to conclude that the amps were, indeed, being hobbled by the RCA-to-XLR conversion.
That being said, anyone want to buy a slightly used Stage One, cheap?
Noslop -- Thanks for the witty presentation of your empirical findings! :)

But I don't think that they necessarily support the conclusion that the RCA-to-XLR conversion is what is responsible for the huge volume difference that you heard between the two configurations.

Even if the power amplifier architecture corresponds to what Rwwear had suggested, which is the worst case in terms of the possible effects of the conversion on the amp's power output, there would only be a 6 dB loss in maximum output power. That represents a 75% reduction in maximum amp output power, which in subjective terms is not anywhere near the difference you describe, imo.

Putting dB losses into perspective, it is commonly said that a 10 dB loss (which is a loss of 90% of the maximum output power) corresponds to a subjective perception of "half as loud." What you are describing sounds like a greater loss than even that, given that with the balanced preamp you had the volume control at less than 1/2 scale, while with the unbalanced one you had it at max.
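The dB-to-power arithmetic above is easy to verify. Here's a minimal Python sketch (my own illustration; the function name is mine, not from anyone's post):

def db_to_power_ratio(db_change):
    # A level change of X dB corresponds to a power ratio of 10^(X/10).
    return 10 ** (db_change / 10)

for loss in (3, 6, 10):
    ratio = db_to_power_ratio(-loss)
    print(f"-{loss} dB keeps {ratio:.3f} of the power ({1 - ratio:.0%} reduction)")

# -3 dB keeps 0.501 of the power (50% reduction)
# -6 dB keeps 0.251 of the power (75% reduction)
# -10 dB keeps 0.100 of the power (90% reduction)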

So something else is going on, obviously related to the extra variable of having two different preamps in the comparison. Perhaps their gains are different, or perhaps something in the complex set of settings of the Stage One is confusing things.

I think that a better test would be to compare balanced vs. unbalanced outputs of the Aurum, with the unbalanced Aurum output going into the power amp through the adapter.

To summarize what has been discussed as to the different possible effects of the adapter on the power amplifier output, depending on its architecture:

If the power amp architecture is as I and Atmasphere envisioned, there would be no sacrifice in maximum output power by feeding it single-ended through the adapter; it would just be necessary to raise the volume control 6 dB higher to reach that maximum output power (compared to balanced drive).

If the power amp architecture is as Rwwear envisioned, then the maximum output power of the amplifier (the point at which it would clip) would be reduced by 6 dB (compared to balanced drive), and that reduced clipping point could not be overcome by the volume control.
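To put numbers on those two cases, here's a small Python sketch (my own simplified model, which just treats output power as proportional to the square of the net voltage across the speaker):

import math

V = 1.0  # maximum swing of each half of a bridged amp, normalized

def loss_vs_balanced_db(net_voltage):
    # Power goes as voltage squared; the reference is balanced drive,
    # where the speaker sees 2V across its terminals.
    return 10 * math.log10((net_voltage / (2 * V)) ** 2)

# Case 1 (as I and Atmasphere envisioned): the amp derives the inverted
# phase internally, so both halves still swing fully and only the input
# sensitivity drops.
print(loss_vs_balanced_db(2 * V))  # 0.0 -- no loss in maximum power

# Case 2 (as Rwwear envisioned): the undriven half sits at 0 V, so the
# speaker sees only half the net voltage and the clipping point drops.
print(loss_vs_balanced_db(V))      # about -6.0 dB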

I think that the fact that you were apparently unable to clip the power amplifier when driving it with the adapter, even with the volume control at max, reinforces my view that your findings so far are inconclusive.

Regards,
-- Al
If as I think the amps are bridged, what happens to the channel that isn't available when using the RCA adapters? It would be half of the amp's power being sent to ground.
"If as I think the amps are bridged, what happens to the channel that isn't available when using the RCA adapters? It would be half of the amp's power being sent to ground."

That was what I was addressing in my previous post when I said:

"If the power amp architecture is as Rwwear envisioned, then the maximum output power of the amplifier (the point at which it would clip) would be reduced by 6 dB (compared to balanced drive), and that reduced clipping point could not be overcome by the volume control."

Actually, 3/4 of the amp's maximum power output would be lost (a 6 dB reduction), not 1/2 (a 3 dB reduction). One side of the amp would swing to the output voltage it is supposed to. The other side of the amp, instead of swinging to the same voltage but with the opposite polarity (minus instead of plus, or vice versa), would be at 0. So the net voltage difference across the speaker that is connected between the two outputs would be half of what it is supposed to be. Half the voltage = 1/4 of the power = -6 dB.
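The same figure falls straight out of the voltage-to-power relationship (a throwaway check in Python, nothing amp-specific):

import math

# P is proportional to V squared, so half the voltage gives a quarter the power:
power_ratio = 0.5 ** 2                # 0.25, i.e. 1/4 of the power
print(10 * math.log10(power_ratio))   # about -6.02 dB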

If an attempt were made to overcome that loss by turning up the volume control, then on loud peaks of the music the amp would be asked to put out more power than that reduced maximum, and the sound would clip/break up/distort. Noslop did not report that he heard any distortion, just that the volume was lower, which tells me that something else was going on, such as his unbalanced preamp having less gain than his balanced preamp.

It's important to keep in mind that reduced power capability does not in itself mean reduced volume (although there would be reduced volume as well, by 6 dB). A moderate loss in volume, such as 6 dB, can be overcome by turning up the volume control a bit. What a loss of output power capability does is to reduce the volume level at which clipping/breakup/distortion occurs.
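A toy illustration of that distinction, with made-up wattage figures: once the volume control makes up the lost gain, the hobbled amp behaves identically on moderate peaks, and only differs when a peak exceeds its reduced ceiling.

def delivered_power(requested_watts, max_watts):
    # The amp delivers what's asked of it until it runs out of headroom,
    # at which point it clips.
    return min(requested_watts, max_watts)

hobbled_max = 250.0  # hypothetical: a 1000 W ceiling reduced by 6 dB

for peak in (100.0, 240.0, 800.0):
    out = delivered_power(peak, hobbled_max)
    print(f"{peak:5.0f} W peak -> {out:5.0f} W delivered"
          + (" (clipped)" if out < peak else ""))

# The 100 W and 240 W peaks pass untouched; the 800 W peak clips at 250 W.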

Regards,
-- Al