The 1:10 ratio does matter, as you want to minimize voltage loss at the load input.
Very true, of course. However, in itself that voltage loss would only result in a very slight reduction in overall system gain. A much more significant issue arises if the output impedance varies significantly as a function of frequency AND does not satisfy the 1:10 ratio at all frequencies. In that situation the result can be both frequency response irregularities and undesirable phase shifts at some frequencies.
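To put a rough number on that "very slight reduction," here is a minimal Python sketch of the voltage divider formed by the source's output impedance and the load's input impedance. The impedance values are purely illustrative, not figures for any particular component:

```python
# Minimal sketch of the source/load voltage divider, treating both
# impedances as purely resistive. Values are illustrative only.
import math

def divider_loss_db(z_out, z_in):
    """Level at the load input relative to the source's open-circuit
    output voltage, in dB."""
    return 20 * math.log10(z_in / (z_out + z_in))

# At exactly the 1:10 ratio the loss is small and, crucially, the same
# at every frequency -- just a fixed reduction in overall gain.
print(divider_loss_db(1_000, 10_000))  # about -0.83 dB
print(divider_loss_db(600, 10_000))    # about -0.51 dB
```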
As George alluded to, that is most likely to be an issue in the case of tube-based components, many of which use a coupling capacitor at their outputs. A capacitor's impedance (its reactance, equal to 1/(2 x pi x f x C)) increases as frequency decreases, so the output impedance in the deep bass region can be much higher than the specified output impedance, which is usually based on a mid-range frequency such as 1 kHz.
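To illustrate, here is a rough Python sketch of the deep-bass rolloff that results when a capacitor-coupled output drives too low a load impedance. The component values (a 1 uF output coupling cap in series with a 600 ohm output resistance) are assumptions I've chosen for the example, not the values used in any actual product:

```python
# Rough sketch of a capacitor-coupled output (series R and C) driving a
# resistive load. Component values are assumed, for illustration only.
import math

def response_db(freq_hz, c_farads, r_out, r_load):
    """Level at the load relative to midband (where the cap's
    reactance is negligible), in dB."""
    xc = 1.0 / (2 * math.pi * freq_hz * c_farads)
    z_total = math.hypot(r_out + r_load, xc)  # series impedance magnitude
    midband = r_load / (r_out + r_load)
    return 20 * math.log10((r_load / z_total) / midband)

for r_load in (100_000, 10_000):
    print(r_load, [round(response_db(f, 1e-6, 600, r_load), 2)
                   for f in (20, 50, 1000)])
# Into 100K the 20 Hz loss is a few hundredths of a dB; into 10K it is
# nearly -2 dB, with corresponding phase shift in the deep bass.
```

Note that the 1 kHz figures come out essentially flat in both cases, which is why an output impedance spec based on a mid-range frequency can look fine while the bottom octaves suffer.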
To cite some examples, Audio Research recommends a minimum load impedance of 20K for almost all of their line stages and preamps, and 60K in the case of a few older designs. And the manual for the fantastic Herron phono stage recommends a minimum load (line stage input impedance) of 50K "for optimum performance," although I recall a member here stating that Keith Herron verbally indicated that 20K is likely to be satisfactory for most users.
Regards,
-- Al