Burn-in and Equipment Value
A lot of people strongly believe that burn-in results in better sound quality, and some will even pay more for cables that have had the burn-in done by the manufacturer. If burn-in is real, then why isn't used equipment worth more than it is? At a minimum, shouldn't demo equipment from a respected retailer be worth more when manufacturer warranties are still in play and the equipment is essentially new?

As a side question, why is it that whenever burn-in is credited for a perceived change in a system, it is assumed that the burn-in occurred on the newest piece of equipment? Some users report changes from burn-in hundreds or even thousands of hours down the road.

I understand break-in on speakers and tube amplifiers, but struggle greatly with things like cables and digital sources.
mceljo
IIRC, a leading cartridge guru says that a few hours is sufficient for mechanical "break-in" of a cartridge. The next 50 to 100 hours are necessary for the cartridge's guts to be bent into the shape required by poor installation, which accounts for the evolution in sound.
I cannot say about cables, but when my Qol was "breaking in," there was a single moment when the sound changed and improved. It was subtle, but very real. I also noted improvements in my system when I added Vibrapods under the components, and I added them one component at a time. The point is to try to make everything perform at its best in your environment and system, and the fun is in trying the free or cheap things to get better sound.
"Burn-in" time is the time frame the owner needs to align his ears to his latest investment. To be more precise, the time he needs to deny that he does not like it...
How can a component have an "optimum operating temperature," and be rated at that temperature, without experiencing a "burn-in" of sorts? No one here doubts the notion of "warm-up" or that a system sounds its best once warmed up.

The signal is energy that passes along or through a wire or component, and it's in the nature of the beast that the part warms up accordingly, albeit incrementally (unless it's a tube), but the effect is there.

So the wire or component does wear in due to the constant on/off cycling. It would follow that warm-up time would decrease a bit and that the wire's or component's properties would change ever so slightly. Is it that far-fetched to say that with a highly resolving system one can hear it? I think it's totally normal and expected, and this debate is nothing more than another exercise.

All the best,
Nonoise
Take speaker cables, for example. One would say the current flowing through them does something that is attributed to "break-in," so it seems reasonable that this same effect would continue over time and at some point become a degradation of the materials involved.

How long do vibration isolation devices take to break in?