The output impedance of a preamp has to be much lower than 600 ohms in order to *drive* 600 ohms. Also, the output impedance at 5Hz should be the same as it is at 1kHz so there is no low frequency rolloff. You can see right away why most preamps will instantly lose bass if actually subjected to a 600 ohm load!
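To put rough numbers on that, here is a quick sketch with assumed, typical values (a 1k ohm output impedance and a 1uF output coupling cap, sized for the usual 100k amplifier input) rather than measurements of any particular preamp. The output impedance forms a voltage divider with the load, and the coupling cap working into the load sets the bass -3dB point:

```python
import math

def divider_loss_db(r_out, r_load):
    """Level lost across the preamp's output impedance (a plain voltage divider)."""
    return 20 * math.log10(r_load / (r_load + r_out))

def bass_corner_hz(c_out, r_out, r_load):
    """-3 dB high-pass point of the output coupling cap into source + load resistance."""
    return 1.0 / (2 * math.pi * (r_out + r_load) * c_out)

# Assumed, illustrative values: 1000 ohm output impedance, 1 uF coupling cap.
R_OUT, C_OUT = 1000.0, 1e-6

for r_load in (100_000.0, 600.0):
    print(f"into {r_load:8.0f} ohms: "
          f"{divider_loss_db(R_OUT, r_load):6.2f} dB level loss, "
          f"bass -3 dB point at {bass_corner_hz(C_OUT, R_OUT, r_load):6.1f} Hz")

# Roughly -0.1 dB and a 1.6 Hz corner into 100k, but about -8.5 dB of loss and a
# corner near 100 Hz into 600 ohms: the instant bass loss described above.
```

Getting that corner back down to 5 Hz into 600 ohms with the same 1k output impedance takes a coupling cap on the order of 20 uF, which is why a true 600 ohm output has to be designed very differently.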
The 99.99999% figure quoted for purity is outright preposterous. Even if you could get that kind of purity in the metal, you can't extrude it into wire and still keep it. For example, if you have Teflon insulation, the extrusion temperature is quite high, which is guaranteed to cause oxidation at a rapid rate. BTW, this is why you don't see copper wire that is Teflon insulated.
One might suspect that bigger current would cause bigger voltage drops across such junctions (or impurities), everything else being equal. Actually, if you work the math the opposite is true- with no current at all, the minor resistances, odd diode effects and the like become more prominent.
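To make that concrete with a toy model of my own (not anything claimed above): treat an oxidized contact or junction as a weak diode. Its small-signal resistance is nVT/(I + Is), so it collapses when real signal current flows through it and balloons as the current heads toward zero. The absolute ohms here mean little, since a real contact is mostly metallic, but the trend is the point: starve the connection of current and the junk in the path takes up a bigger share of the job.

```python
VT = 0.0259   # thermal voltage at room temperature, volts
N = 1.0       # diode ideality factor (assumed)
I_S = 1e-12   # saturation current of the hypothetical junction, amps (assumed)

def junction_resistance(i_signal):
    """Small-signal resistance of a diode-modelled junction carrying i_signal amps."""
    return N * VT / (i_signal + I_S)

# From the milliamps a 600 ohm line carries down to essentially no current at all.
for i in (2e-3, 1e-6, 0.0):
    print(f"I = {i:9.3e} A -> junction behaves like {junction_resistance(i):11.3e} ohms")

# About 13 ohms at 2 mA, roughly 26k ohms at 1 uA, and tens of gigohms with no
# current: the same junction goes from negligible to dominant as the current vanishes.
```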
The 600 ohm standard was developed over 50 years ago and successfully eliminated cable variables. It made hifi possible- now you could hang a set of microphones in the ideal location in any hall, without concern for where the recorder had to be.
IOW the vast majority of all recordings use this technique. This is why a classic Mercury or RCA sounds better as you improve the playback- you don't hear cable problems in the recordings because there are none.
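Here is a rough sketch of why terminating in 600 ohms takes the cable out of the equation, using assumed, ballpark numbers (100 pF per meter of cable, a 50 m run from the hall to the recorder) rather than any specific cable. The cable's capacitance works against the source and load impedances to form a treble rolloff, and the low-impedance line pushes that rolloff far above the audio band:

```python
import math

CABLE_PF_PER_M = 100.0   # assumed, ballpark capacitance of balanced cable
RUN_M = 50.0             # a long run from mics hung in the hall to the recorder

def treble_corner_hz(r_source, r_load, cap_farads):
    """Low-pass -3 dB point formed by cable capacitance against source || load."""
    r_th = (r_source * r_load) / (r_source + r_load)
    return 1.0 / (2 * math.pi * r_th * cap_farads)

c_cable = CABLE_PF_PER_M * RUN_M * 1e-12   # 5 nF for the assumed run

# A low-impedance line (150 ohm source into a 600 ohm termination) versus a
# high-impedance hookup (10k source into a 47k input), same 50 m of cable.
for name, r_src, r_load in (("600 ohm line", 150.0, 600.0),
                            ("hi-Z hookup ", 10_000.0, 47_000.0)):
    print(f"{name}: treble rolloff starts near "
          f"{treble_corner_hz(r_src, r_load, c_cable) / 1000:.0f} kHz")

# Roughly 265 kHz for the terminated line (far out of band) versus about 4 kHz for
# the high-impedance hookup: the cable variable the standard was meant to remove.
```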
I have often marvelled at the fact that audiophiles are willing to pay large sums for cables, yet are often uninterested in a proven system that eliminates cable artifacts altogether.
Think of it this way. If you have two cables and one sounds better than the other, right away you have to be suspicious of both. Why? Next year, the manufacturer of the 'better' cable will have a new model that is more expensive yet, and sounds better- we have all seen this! How about a system where the cheapest cable sounds as good as the best cable? Wouldn't that be something of interest?