Unbalanced to balanced VPI junction box change?


I have a VPI Classic 1 with Valhalla wiring. I currently have an RCA junction box and am considering changing to XLR. My only reservation is needing to upgrade the wiring along with the junction box, which by my estimate makes it roughly a $1,000.00 upgrade. I love the sound now; would it be worth upgrading?
harris4crna
Al has got that right. In addition, Jonathan Carr's input mirrors my own experience in the design of phono preamps.

**As a general rule of thumb**, if you can hear differences between balanced cables, it means that some aspect of the balanced standard is not being supported/observed by the equipment involved.

I don't mean to derail the thread, but to add to Al's comment about high-output (moving magnet) cartridges: they are high enough in impedance that the cables play a role. Not only that, but I have yet to find one that loads properly at 47K. To get them to sound right, the actual correct load impedance is usually much lower; on many Grados, for example, the right load impedance seems to be around 8K-12K.
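A quick sketch of the arithmetic this implies, assuming the load is lowered by adding a resistor in parallel with a standard 47K input (the target values below are illustrative, not a recommendation for any particular cartridge):

```python
# Illustrative sketch: what resistor, placed in parallel with a 47k phono input,
# yields a given target load for the cartridge.
def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel."""
    return (r1 * r2) / (r1 + r2)

def loading_resistor(target_ohms, input_ohms=47_000):
    """Resistor to add across the input so the cartridge sees target_ohms."""
    return (target_ohms * input_ohms) / (input_ohms - target_ohms)

for target in (8_000, 10_000, 12_000):          # the 8K-12K range mentioned above
    r = loading_resistor(target)
    print(f"target {target // 1000}k -> add ~{r / 1000:.1f}k in parallel "
          f"(check: {parallel(r, 47_000) / 1000:.1f}k)")
```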

Sorry for the derail.
Harris, you would be surprised at the benefits you would get from a decent isolation platform under your table, such as those from Symposium and others.
Atmasphere.....I have a totally balanced system (all Ayre components) and can readily hear the difference in cables....
Converting to balanced is a no-brainer. Easy to DIY and plenty of adapters available if you don't know which end of a soldering iron to hold.
> I have a totally balanced system (all Ayre components) and can readily hear the difference in cables....

I have a lot of respect for Ayre, but regardless, if you can hear differences between the cables, some aspect of the balanced line system is not being supported and is allowing the cables to manifest some artifact.

Here are the standards:

1) Pin 1 is ground; pins 2 and 3 carry the signal, out of phase with each other. In the US, pin 2 is non-inverting.

2) The signal occurs between pins 2 and 3; pin 1 is ignored and is used only for shielding.

3) The cable will have a twisted pair for the signals, within the shield.

4) The connection will be low impedance (LOMC phono is a good example: quite frequently the cartridge sees 100 ohms or less at the input of the preamp), such that the source can drive 600 ohms without loss of bandwidth.
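To illustrate item 2, here is a minimal sketch (my own illustration, with made-up signal levels) of why taking the signal between pins 2 and 3 rejects noise that couples equally onto both legs of the twisted pair:

```python
# Illustrative sketch: hum that couples equally onto pins 2 and 3 cancels when
# the receiver takes the difference between the two pins (made-up levels).
import numpy as np

t = np.linspace(0, 1e-3, 1000)                     # 1 ms of time
signal = 0.005 * np.sin(2 * np.pi * 1_000 * t)     # 5 mV, 1 kHz "music"
hum = 0.050 * np.sin(2 * np.pi * 60 * t)           # 50 mV of 60 Hz pickup

pin2 = +signal / 2 + hum     # non-inverting leg, plus the common-mode hum
pin3 = -signal / 2 + hum     # inverting leg picks up the same hum

single_ended = pin2                  # measured against ground: hum rides along
differential = pin2 - pin3           # pin 2 minus pin 3: hum cancels

print("residual hum, single-ended :", np.ptp(single_ended - signal / 2))
print("residual hum, differential :", np.ptp(differential - signal))
```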

It is items 2 and 4 where most high-end audio products don't adhere to the standard. The reason this standard was created, BTW, was to eliminate interconnect cable interaction with the sound. Think about recordings made in the 1950s and you will see what I mean: quite often in these recordings the microphone signal had to travel up to 200 feet before it arrived at the tape recorder, yet, as we can hear, the signal somehow arrived in good condition. This was entirely due to the use of the balanced standard.
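To put a rough number on that 200-foot run, here is a back-of-the-envelope sketch, assuming roughly 40 pF/ft of cable capacitance and treating the source impedance and cable capacitance as a simple RC low-pass (the figures are illustrative assumptions, not from the post):

```python
# Back-of-the-envelope sketch: -3 dB point of the low-pass formed by the source
# impedance and the cable's capacitance (assumes ~40 pF/ft, an illustrative value).
import math

def rolloff_hz(source_ohms, cable_ft, pf_per_ft=40.0):
    c = cable_ft * pf_per_ft * 1e-12           # total cable capacitance, farads
    return 1.0 / (2 * math.pi * source_ohms * c)

for source_ohms in (150, 600, 47_000):         # low-Z balanced sources vs. a high-Z one
    f = rolloff_hz(source_ohms, 200)           # the 200-foot run mentioned above
    print(f"{source_ohms:>6} ohm source, 200 ft: -3 dB at ~{f / 1000:.1f} kHz")
```

With a low-impedance source the rolloff stays far above the audio band even over a very long cable, while a high-impedance source would roll off well inside it.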

The implication here, of course, is that the cost of the cable has nothing to do with how it sounds.

I have always thought that audiophiles would be interested in a means of getting interconnect cables to not 'editorialize' upon the audio signal. You would be surprised how difficult it can be to get across what the benefits are. For example, the length of the cable, or lack of it, has no bearing on the benefits derived from being balanced.

At any rate, if you can hear differences in cables, as you have mentioned, it does in fact mean that the equipment is not supporting the standard.