cable burn-in / system burn-in


So many of us just take what we hear as being the gospel truth about equipment. I know I do, a lot of the time, because there is just too much work and cost involved in proving it. I have to finally agree with the burn-in effect. After several years, and multiple equipment changes, I can say, without a doubt, that equipment and cable burn-in makes a very large impact on the sound. I just started my system again after it had been down for a few months. It took about 40 hours of play time before it started to sound good again. I have a CD that I always play to hear the effect, one I am very familiar with. So it is kind of scientific, and not just arbitrary. So there you have it...
johnhelenjake
No, Dave, you guys just have different systems, goals, preferences, etc. I have tried side-by-side comparisons with two of the same cable, one cooked, one not. The uncooked one always sounded better to me once it had settled in on its own. I do think that even cooked cables settle in after a while, perhaps adjusting to the difference in voltage/current between the cooker and the system components.
For what it's worth, I am told by my high-frequency specialist friends and colleagues that it was well known among scientists working in high-frequency laboratories that new cables "settled in" after a while. I am unaware of any "cable cookers" used in the "serious" radio industry. It seems, though, that the cables had to be settled in in the very application in which they would be used. In other words, you don't get a high-frequency cable to work well by letting it settle in as a power cable for a while. It has to be the same high-frequency application.

I don't know if this stuff was ever published, but it was known and talked about according to people who used to work in the field. I don't know if it was measurable, though. I'd have to ask about this -- highly INTERESTING!

My present theory is that it is nothing but simple degaussing that is going on. You take magnetic domains and keep making them smaller and smaller; that's what degaussing is all about. It is done by taking a signal and making a "fade-out" out of it, similar to the procedure in the old CRT monitors when you pressed the DEGAUSS button.
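To illustrate the idea (a sketch only, not a tested burn-in recipe; the function name, carrier frequency, and fade length here are hypothetical choices, not anything from the thread), a degaussing-style signal is simply a periodic waveform under an envelope that decays smoothly from full scale to zero:

```python
import numpy as np

def degauss_signal(duration_s=10.0, freq_hz=60.0, sample_rate=44100):
    """Generate a degaussing-style waveform: a sine carrier whose
    amplitude fades linearly from full scale down to zero."""
    n = int(duration_s * sample_rate)
    t = np.arange(n) / sample_rate
    envelope = 1.0 - t / duration_s          # linear fade-out, 1 -> 0
    return envelope * np.sin(2 * np.pi * freq_hz * t)

sig = degauss_signal(duration_s=1.0)
# Because the envelope shrinks monotonically, later portions of the
# signal never swing wider than earlier portions.
```

A real CRT degauss coil uses a decaying alternating field in the same spirit; whether any of this has an audible effect on cables is exactly the question being debated in this thread.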

Now, when you look at a music signal (or any sound signal, for that matter), it is all a bunch of fade-outs. That's what echo and reverberation and all the tails of all the percussive sounds are.

I have taken this theory and practiced with it over the years.

The result was that 100 short fade-outs, each of approximately 10 seconds and going all the way down to zero, sound worse than one very long fade-out going from full power to zero over 100 x 10 seconds.

I have not yet been able to discern any further improvement in quality once the fade-out reached 7 days. In other words, I could not hear a difference between a cable processed with 7 days of one non-stop fade-out and another, equally new, cable processed with a 10-day non-stop fade-out.

Who has had any other tangible results and methods?

Louis Motek
Louis,

Yes, that is correct. At high frequencies, things like capacitance and the dielectric can have a more pronounced effect, and a drift can be expected as equipment ages and settles in. However, audio is NOT high frequency.
Yes, especially power cables!

Today I tested the newest theory. Without a shadow of a doubt, a power cable played in a system every day for two months doesn't sound anywhere near as beautiful as the same cable faded out over 7 days. So settling-in is different from burn-in, if it exists.

It could also be that, as I said earlier, music is a bunch of small fade-outs, so some small amount of degaussing may occur even with brand new cables, whereby a complete degaussing (7 days of one continuous fade-out procedure) positively dwarfs the minute change produced by the small fade-outs which constitute the music signal.

I think that if this is so, burn-in and settling-in must not be considered two names for the same concept. They are different things altogether: burn-in happens once and for all time, while settling-in needs to happen every time you turn off your system for a few days or weeks.

DANGEROUS THOUGHT: There is a logical and perhaps scary conclusion to be drawn from this. It means that you could possibly WORSEN the sound of a cable on purpose by applying a very long fade-in with an abrupt ending. This is similar to playing music backwards. If this is possible, it becomes even harder to believe in cable comparisons, since the candidates might be tampered with purposely by signals which knowingly alter the sound in BOTH directions: good for cable "A" and bad for cable "B". Then a blind test in front of the unsuspecting public... all of whom will choose cable "A".

Louis Motek
Louis, you lost me on this fade-out thing. For me, the bottom line is that when I have a system that is taken down and turned off, with cables and equipment moved, then once the system is re-assembled it takes time for the system's sound to come together and play to its potential. That just happened here. If I add new gear, break-in / burn-in (call it what you want) is needed. With equipment that does not need break-in, the equipment and cables still need to "settle in" before the sound comes together.

For me, settling-in happens over the shorter term, while break-in, depending on the equipment involved, can take hundreds of hours. The accompanying change in sound is very real, and I use bass, and its ability to go from lean and thin to pressurizing a room, as a palpable example of one of the possible effects of settling-in, break-in or burn-in. Bass pressurizing a room gets outside the realm of the more subjective perceptions audiophiles talk about, like "imaging" and "soundstage depth."