Can I convert stereo to mono for a phono input?


My preamp doesn't have a switch to convert stereo to mono.  I would like to sum a stereo signal to mono to set my cartridge azimuth (with the two channels out of phase, this method allows accurate balancing of the channels).  Is there a way to build a simple converter: two female RCA jacks taking the stereo signal in, mixing it to mono, and outputting it on two male RCA plugs (the same mono signal on both)?


I have a test LP that provides a stereo track (test signal) with the two sides out of phase.  All I would need to do is feed that through the mono converter to set my azimuth. 
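If I understand the principle, the converter just averages the two channels, so the out-of-phase test band should cancel when the channels are balanced, and any residual level indicates an error. Here is a quick sketch of the idea (Python, purely illustrative; the simple 50/50 average ignores source and load impedance, and the imbalance figures are made up):

```python
import numpy as np

# Illustrative only: model the blend box as a simple average of the two
# channels, fed with an out-of-phase test tone. The imbalance values are
# hypothetical azimuth errors expressed as a channel gain error.
fs = 48000
t = np.arange(fs) / fs
tone = np.sin(2 * np.pi * 300 * t)          # 300 Hz, but any tone works

for imbalance_db in [0.0, 0.5, 1.0, 2.0]:
    g = 10 ** (imbalance_db / 20)
    left = g * tone                          # one channel runs slightly hot
    right = -tone                            # other channel equal but inverted
    mono = 0.5 * (left + right)              # what the summing adaptor outputs
    rms = np.sqrt(np.mean(mono ** 2))
    ref = np.sqrt(np.mean(left ** 2))
    level = 20 * np.log10(max(rms, 1e-12) / ref)   # guard: perfect balance nulls completely
    print(f"{imbalance_db:.1f} dB imbalance -> residual {level:7.1f} dB below channel level")
```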

Thanks for any advice.  Peter

peter_s,
It’s not the cartridge that needs XLR outputs, it’s the arm cable that does. Some arm makers, and arm cable makers as well, DO offer XLR termination, and some phono amp makers offer XLR inputs. An arm cable that is terminated in RCA plugs can have them replaced with XLRs.

Thanks everyone so far.  I have the Hifi News test record, with the following track:

Band 5: Cartridge alignment (azimuth) test (300Hz L-R, +6dB)

This track is designed for cartridge azimuth adjustment. To check that the stylus is absolutely vertical, play the track in stereo: you should get identical output from each speaker, but when the amplifier is switched to mono you will hear nothing.

I also have the Cardas Test Record, which has a 1kHz test tone.

It seems like my options are as follows:

  1. Swap the cartridge leads on one channel, use the out-of-phase band, and reduce the reading to zero on a voltmeter (but I don't want to bother swapping leads)
  2. Leave the leads alone, use a voltmeter as Erik describes, and play an in-phase test tone to get to zero
  3. Attach a recording device to my preamp and adjust azimuth so an in-phase test tone yields equal values per channel (see the sketch just after this list)
  4. Contact the manufacturer of my preamp and see if I can use Y adaptors to blend the two channels and then split the blended signal back into two, downstream of my phono preamp, in order to get mono. Then use the out-of-phase track to zero the azimuth.
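
For option 3, this is roughly what I have in mind, as a minimal sketch (Python; it assumes a stereo WAV capture of the preamp output, and the file name is hypothetical):

```python
import numpy as np
from scipy.io import wavfile

# Rough sketch for option 3 (file name hypothetical): capture the preamp's
# output while the in-phase test tone plays, then compare per-channel RMS.
# Adjust azimuth and re-capture until the balance reads 0.00 dB.
rate, data = wavfile.read("azimuth_capture.wav")   # expects a stereo file
data = data.astype(np.float64)
rms_l = np.sqrt(np.mean(data[:, 0] ** 2))
rms_r = np.sqrt(np.mean(data[:, 1] ** 2))
print(f"{len(data) / rate:.1f} s capture, "
      f"L/R balance {20 * np.log10(rms_l / rms_r):+.2f} dB")
```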

Any thoughts on which of these is the best solution, or are there other good ideas?  I will email the manufacturer of the phono stage (Einstein) about blending the signal into mono.

Thanks, Peter

You need the maximum AC voltage!! :) Same connection.
I had thought about suggesting that, Erik. But I’m not certain that maximizing the voltage difference between the out-of-phase signals on the two channels would allow as precise a determination as adding them together and nulling the result.

Envision that azimuth has been optimized precisely by maximizing the measured difference between the out-of-phase signals. Then envision that a small misadjustment of the azimuth is introduced. It seems to me that even though the amplitudes of the two signals might become significantly unequal, with one becoming larger and one becoming smaller, their difference might not change much if at all from the previously established maximum, since any quantity changes only slowly in the vicinity of its maximum. Whereas if they were added together they would no longer sum to zero.
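
To put rough numbers on that intuition, here is a small simulation (Python). The rotation model is my own simplification, not a claim about any particular cartridge: it treats the azimuth error as a rotation of the two pickup axes and compares what the two methods would read.

```python
import numpy as np

# Toy model (my simplification): a vertical, out-of-phase cut s(t) is read
# by pickup axes at +/-45 degrees, and an azimuth error theta rotates both
# axes together, giving
#   L =  sin(45deg + theta) * s(t)
#   R = -sin(45deg - theta) * s(t)   (wired so a vertical cut reads out of phase)
for deg in [0.0, 0.5, 1.0, 2.0]:
    th = np.radians(deg)
    a_l = np.sin(np.radians(45) + th)      # left-channel amplitude
    a_r = np.sin(np.radians(45) - th)      # right-channel amplitude
    mono_sum = abs(a_l - a_r)              # amplitude of L + R (the null method)
    difference = a_l + a_r                 # amplitude of L - R (the max method)
    print(f"{deg:4.1f} deg error: mono sum {mono_sum:.4f}, "
          f"difference {difference:.4f} (max {np.sqrt(2):.4f})")
```

In this toy model a 2 degree error lifts the summed (mono) reading well off zero, while the difference reading being maximized drops by less than 0.1%, which would be very hard to resolve on a meter.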

Best regards,
-- Al


Al - what do you think of my suggestion #2 above?  If the signal is in phase (e.g., a 1kHz test tone), measuring the difference between channels with a voltmeter would give me an accurate zero, no???
Hi Peter,

Regarding your question just above, the following comment I made earlier applies. In addition to option 2, it also applies to option 3. The bottom line is I don’t know :-)
I would wonder if the corresponding lateral movement of the stylus would allow as precise an adjustment of azimuth as the vertical movement that occurs while playing an out of phase track would allow. Maybe it would and maybe it wouldn’t; I’m just not sure.
Regards,
-- Al