@ramtubes
And if burn-in occurs in cabling, why don’t we see anyone producing measurable results comparing a new cable with a burned-in one? The suspicious thing is that when most cable manufacturers are hyping either the technical reasons why their cables produce better sound, or telling you the cables need burn-in, they are always appealing to some objective, technical phenomenon whose existence is known precisely because it was measurable. "Here’s a technical problem with cables you need to know about, that we have solved via our manufacturing process!"
But when they tout that they have "solved" one or more objective technical problems in cable design, they typically don’t demonstrate they’ve solved the problem in any measurable way. Instead, the results go straight to hype, marketing, and the subjective impressions of audiophiles and reviewers. Same with burn-in. Funny, that.
As I’ve mentioned before in such discussions, audiophiles think everything changes substantially with "burn-in": fuses, resistors, cables. And yet companies like Vishay and others - responsible for selling cabling, resistors, etc. to professional customers, including industrial, computing, and incredibly spec-sensitive applications in avionics, military, and aerospace design - don’t go on about "burn-in." If the specs of a cable, resistor, or fuse actually changed that much between delivery and in-use, that is something customers employing them in sensitive applications would need to know (and it would obviously be very problematic if those industries could not rely on a product actually meeting its stated specs out of the gate).
And, again, you see (as far as I know) none of this "please burn our product in for 100 hours before use, as the specs will change" when the rubber hits the road - that is, when you sell these things to engineers who can tell marketing B.S. from reality.
For those who will likely disagree: if burn-in exists in these devices, why didn’t we know about it until recently? I find no references to burn-in in the ’50s, ’60s, or ’70s. When did it start?