Dual Differential / Balanced?


Hey all, I’ve got that itch to upgrade power amps, and was wondering how valid the dual differential (aka "balanced") monoblock or dual mono design is in terms of increasing fidelity compared to a conventional single-ended (SE) amp. Note that my preamp is also fully balanced.

How much noise is avoided by using a fully balanced system?

Right now I use two Haflers horizontally biamping NHT 3.3s, with Mogami Gold XLR: a P4000 (200 Wpc) on the mids/highs and a P7000 (350 Wpc) on the lows.

From what I’ve read, it only matters if the preamp and power amp are both truly balanced.

I have a nice Integra Research RDC-7.1 fully balanced pre/pro (it was a collaboration with BAT). I would go for the matching RDA "BAT" amp, but it’s pretty much unobtainium.

So far I’ve looked at the Classe CA-200/201, older Thresholds, and older KSA-series Krells as fully balanced monoblocks / dual mono stereo amps.

I was also told to look at ATI amps; they look very impressive but expensive.

I’m looking to spend $1500-2500, preferably on used products. I don’t have an issue with SE amps; I just want to exploit the fact that my pre is fully balanced, and perhaps get better sound. I’d welcome any recommendations for good dual differential power amps. The NHT 3.3s are power hungry, so at least 150 Wpc, class A/AB.

I’ve also come across the Emotiva XPA-1 monoblock. I can get a good deal on one of them, and I wonder if it’s worth picking it up and praying for a lone mate to show up in the classifieds or on eBay. Note this is the older model in the silver chassis: 500 Wpc into 8 ohms / 1000 into 4 ohms.

For context, prior to realizing that I should use a fully balanced system, I was looking at Bryston and McCormack amps. Thanks.
nyhifihead
Well Ralph, your explanation of the misuse/improper application of "balanced" circuits would explain why I don’t usually find them better sounding in many audio systems. Nonetheless, single-ended/unbalanced components can often sound fabulous. Superior implementation of either circuit type is mandatory.
Charles, 
Thank you Ralph, I will keep reading more. I use fully balanced stages in low-level amps without any ground reference. They have a bandwidth of only a few kHz, but the full-scale signal is in single millivolts. At the end, an A/D converter with a differential input provides its own ground reference. I can see a problem with a fully balanced audio amp design that has no reference to ground: both outputs can float together, since without an output current or voltage difference the feedback won’t react to it. It needs a ground reference somewhere, or some kind of servo on the common mode.
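The "servo on common mode" idea above can be shown with a toy calculation. This is only an illustrative sketch with made-up voltages, not any particular amp's circuit: differential feedback sees only the difference between the two legs, so a drift common to both legs is invisible to it, and a common-mode servo nulls the average without disturbing the difference.

```python
# Toy sketch of a common-mode servo on a floating differential output.
# All voltages are illustrative assumptions, not measurements.

v_pos, v_neg = 3.0, 1.0      # volts: a 2 V differential signal riding
                             # on a 2 V common-mode drift

diff = v_pos - v_neg         # 2.0 V: the only thing differential
                             # feedback can "see"
cm = (v_pos + v_neg) / 2     # 2.0 V: common-mode level, invisible to
                             # differential feedback

# The servo subtracts the common-mode average from both legs,
# re-centering the pair on ground without touching the difference:
v_pos -= cm
v_neg -= cm

print(v_pos, v_neg)          # 1.0 -1.0: same 2 V difference, now centered
```

The point is that `diff` is unchanged by the correction, which is why differential feedback alone could never have performed it.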

A shield should never be used to carry signal, but unfortunately that was common practice a long time ago. A scope’s coax is a good example. A scope with shorted leads, touching the circuit under test, shows a phony signal, since the ground return path (possibly through the supply) causes current to flow through the shield, which the input amp (referenced to the scope’s BNC ground) sees as a signal (the voltage drop across the shield). The shield converts common mode into a fake normal-mode signal.
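The size of that fake signal is just Ohm's law on the shield. Here is a back-of-the-envelope sketch with assumed (not measured) values for shield resistance and ground-loop current:

```python
# Shield common-mode-to-normal-mode conversion, back of the envelope.
# Both numbers below are illustrative assumptions.

shield_resistance = 0.05     # ohms: end-to-end resistance of the shield
ground_loop_current = 0.010  # amps: 10 mA of stray ground-loop current

# On an unbalanced (coax) connection the shield is the signal return,
# so its IR drop adds directly to what the input amp sees:
fake_signal = shield_resistance * ground_loop_current  # volts

print(f"Fake 'signal' from shield drop: {fake_signal * 1e6:.0f} uV")
```

That works out to 500 uV, which is enormous next to a millivolt-scale signal like the low-level stages described above, and is exactly why a balanced input (which ignores the shield drop as common mode) helps.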

My small Rowland has only XLR inputs (perhaps a mature decision in a class D amp), so I was not able to compare it with a single-ended RCA cable. I can still hear a substantial sound quality difference between a decent XLR cable (AQ King Cobra) and a very good one (AZ Absolute).

I have to read the AES48 standard you mentioned. There was a wonderful EDN magazine issue on that many years ago. Grounding and shielding are considered black magic by many.


Kkijanki,
A friend of mine has the Atma-Sphere MP-1 preamplifier and the MA-1 amplifiers. Even with this fully differential balanced signal path, he says he can hear differences between various brands of balanced interconnects. By the way, his MP-1 sounds terrific mated to my Coincident 300B SET amplifier.
Charles, 
Charles & Kijanki, see Ralph’s first post dated 3-22-2013 in this thread for a summary of what is necessary to make a balanced line-level interface insensitive to cable differences, and also for what I consider to be a compelling proof of his contention.

But also note a question I posed a few posts later in that thread, and the answer Ralph provided:
Almarg 3-27-2013
... to eliminate interconnect cable differences is it necessary that the component output actually BE driving a low load impedance (2000 ohms or less, to use your figure), or is it just necessary that it be CAPABLE of doing that?

Atmasphere 3-27-2013
You touched on an interesting point about the load vs. the capability to drive that load. I have found that the capacity to drive the load plays a huge role, about 80%, but for that last bit of cable artifact to go away the load must be there as well. Flipping the coin over: if the capacity to drive the load is absent, then it’s moot and there will be cable colorations.
I see that the MA-1 amplifier which Charles mentioned has a specified balanced input impedance of 200K. I’d imagine that one reason Ralph chose not to provide a very low input impedance, such as the 600 ohm standard he refers to, is that most preamps made by other manufacturers would not be able to drive such a load with good results.
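The reasoning above is just a voltage divider between the preamp's output impedance and the amp's input impedance. A minimal sketch, with assumed impedance values (the 2K output impedance here is a hypothetical high-output-impedance tube preamp, not any specific model):

```python
# Why a low-impedance balanced load is hard to drive: simple resistive
# voltage divider between source output impedance and load impedance.
# Impedance values are illustrative assumptions.

def delivered_fraction(z_out, z_load):
    """Fraction of the source voltage that actually reaches the load."""
    return z_load / (z_out + z_load)

# A hypothetical tube preamp with 2K output impedance into the old
# 600 ohm professional standard loses most of its signal:
print(delivered_fraction(z_out=2000, z_load=600))      # ~0.23

# The same preamp into a 200K input (like the MA-1's) is nearly lossless:
print(delivered_fraction(z_out=2000, z_load=200_000))  # ~0.99

# A low-impedance output (e.g. 50 ohms) drives 600 ohms with little loss:
print(delivered_fraction(z_out=50, z_load=600))        # ~0.92
```

This sketch ignores reactance and frequency dependence, but it shows why a 600 ohm input would be unkind to most third-party preamps, while a 200K input is safe for nearly anything.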

Best regards,
-- Al

Thanks, Al.
So very specific requirements are necessary in order for Ralph's point to hold; I appreciate the clarification. This would explain my friend's findings with his Atma-Spheres and his various balanced cables.
Charles,