I remember seeing a local recording engineer set up the mics a good 150 feet from the recorder, and the recording was fabulous. Years later I had that event in mind when I set the goal of supporting the same kind of operation.
It's interesting that you say this . . . typical mic preamp inputs have usually strived to provide an input Z many times the source impedance. Old (transformer-coupled) tube consoles are usually in the area of 1200 to 1500 ohms, and were driven by microphones with very low source impedances (an RCA 44BX is like 30 ohms). Nowadays, most microphones are standardized around a 150 ohm output impedance, and console input impedances have risen as well - 2.5K to 5K is common, and some are higher (like 10K). The main reason is that microphones generally have a flatter (or at least more predictable) response into higher loads, because their source impedance (especially in dynamics and ribbons) can vary significantly with frequency. 600 ohms is about the very lowest input impedance that even a microphone is likely to see (maybe when splitter transformers are used) . . . and consumer audio balanced outputs very frequently have higher output impedances than a microphone's 150 ohms.
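To put a rough number on the bridging argument: the mic and the preamp input form a simple voltage divider, so any frequency-dependent swing in the mic's source impedance shows up directly as response ripple, and the ripple shrinks as the load impedance grows. A quick sketch in Python (the swing from 150 to 600 ohms is a made-up but plausible figure for a dynamic around its resonance):

```python
import math

def divider_loss_db(z_source, z_load):
    """Level at the preamp input relative to the mic's open-circuit voltage."""
    return 20 * math.log10(z_load / (z_source + z_load))

# Response variation if the mic's impedance swings from 150 to 600 ohms:
for z_load in (600, 1500, 2500, 10_000):
    ripple = divider_loss_db(150, z_load) - divider_loss_db(600, z_load)
    print(f"Z_load = {z_load:>6} ohms: response variation = {ripple:.2f} dB")
```

Into 600 ohms that swing costs about 4 dB of ripple; into 10K it's under 0.4 dB, which is exactly why input impedances have drifted upward.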
I think we've been down this road about 600 ohm terminating resistors before, but I would still like to remind everybody that 600 ohms is a MEASUREMENT standard borrowed from telephone equipment, which is a power-transfer system (equal source and termination impedances). The standard in audio interconnection, whether balanced or unbalanced, has always been voltage transfer. Audio measurement equipment has frequently had 600-ohm source and termination impedances so that it could accurately measure signal level in dBm (dB referred to 1 mW into 600 ohms).
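For anyone who hasn't done the arithmetic: 1 mW dissipated in 600 ohms works out to about 0.775 V RMS, since V = sqrt(P x R). That voltage, divorced from any particular load, is what we now call 0 dBu. A two-line check:

```python
import math

P_REF = 1e-3   # 1 mW, the dBm reference power
R = 600.0      # the telephone-era reference impedance
print(f"0 dBm into 600 ohms = {math.sqrt(P_REF * R):.4f} V RMS")  # ~0.7746 V
```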
I absolutely agree with you that high-end consumer audio equipment with XLR outputs should be able to drive a 600 ohm load with full performance. I also understand that there exists much equipment with transformer-coupled inputs, where the designer has chosen to increase the transformer step-up ratio to improve noise performance, resulting in an input impedance that may be as low as 600 ohms.
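That trade-off is easy to see on paper: a 1:n step-up transformer gives n times the signal voltage for free (roughly 20*log10(n) dB of noise improvement referred to the source), but the following amplifier's input impedance comes back through it divided by n squared. A sketch assuming a 10K stage after the transformer (illustrative numbers, not any particular product):

```python
import math

z_amp = 10_000.0   # assumed input impedance of the stage after the transformer
for n in (1, 2, 4):
    z_in = z_amp / n**2                # impedance the source actually sees
    gain_db = 20 * math.log10(n)       # "free" voltage gain from the turns ratio
    print(f"1:{n} step-up: input Z = {z_in:>6.0f} ohms, +{gain_db:.1f} dB of signal")
```

At 1:4 you're already down to 625 ohms, which is how those 600-ohm-ish inputs come about.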
But adding a 600 ohm terminating resistor doesn't make cable reactance disappear, and it's certainly NOT a "standard". Even if you wanted to treat the interconnect as a transmission line (say you had 1000-foot runs between your preamp and amp), then both the source and termination impedances should match the cable's characteristic impedance, which is more like 150 ohms. (That's why AES/EBU is balanced and operates at 110 ohms . . . it was designed to use standard studio microphone or interconnect cables.)
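It's also worth a back-of-envelope check on whether audio cables behave as transmission lines at all at these lengths. Assuming a velocity factor of about 0.66 (typical for common shielded pair, though it varies):

```python
C = 3.0e8            # speed of light, m/s
VF = 0.66            # assumed cable velocity factor
F_MAX = 20_000.0     # top of the audio band, Hz

wavelength_m = (C * VF) / F_MAX
run_m = 1000 * 0.3048                    # that 1000-foot run, in meters
print(f"Wavelength at 20 kHz: {wavelength_m:,.0f} m")
print(f"1000 ft = {run_m:.0f} m = {run_m / wavelength_m:.3f} wavelengths")
```

Even the 1000-foot run comes out around 0.03 wavelengths at 20 kHz - electrically short by any rule of thumb - so matched termination buys you nothing in the audio band anyway.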
In the end, when a manufacturer chooses to put in a 600 ohm terminating resistor, it's a good bet that the equipment driving it (especially if it's from a different manufacturer) will not be performing at its best. The resistor's main effect is to swamp the effects of any impedance unbalances in the input circuitry, thus improving the equipment's common-mode rejection. A crude trade-off, IMO, and again, definitely non-standard.
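If you want to see the swamping effect quantified, here's a toy model with made-up numbers, treating the terminator as a matched 300 ohm resistor from each leg to ground (one way a "600 ohm" termination gets built). Common-mode voltage converts to differential signal when the two legs' impedances, working against the source impedance, don't divide equally:

```python
import math

def cmrr_db(rs, z1, z2):
    """CMRR set purely by impedance imbalance: per-leg source impedance rs,
    per-leg input impedances z1 and z2."""
    conversion = z1 / (z1 + rs) - z2 / (z2 + rs)
    return -20 * math.log10(abs(conversion))

def shunt(z, r):
    """Parallel combination of an input leg and its terminating resistor."""
    return z * r / (z + r)

RS = 300.0                    # per-leg source impedance (a 600-ohm source)
Z1, Z2 = 20_000.0, 22_000.0   # hypothetical 10% mismatch between input legs

print(f"Unterminated:    CMRR = {cmrr_db(RS, Z1, Z2):.1f} dB")
print(f"With terminator: CMRR = {cmrr_db(RS, shunt(Z1, 300), shunt(Z2, 300)):.1f} dB")
```

The matched 300-ohm resistors pull the two legs within a fraction of a percent of each other and pick up about 12 dB of CMRR in this example - a real improvement, but bought by loading the source, which is the crude trade-off I'm talking about.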