There's a difference between warm-up and burn-in.
Warming up is pretty obvious. Ask a race car driver if there's a difference between cold and warm tires. Similarly, some electronic equipment may have an optimal operating temperature for most efficient operation.
As far as burn-in goes, people debate it. However, go to the nearest university engineering or science library and look for a book called "A Million Random Digits with 100,000 Normal Deviates" (I believe that's the correct title). It's not too exciting; it contains a million random digits. But read the preface.
In a nutshell, here's the story. In the early days of computers, scientists at the RAND Corporation thought it would be good to develop a handy list of random numbers, using the convenience of an electronic machine to generate them. The result is the book. It's not always referenced, but pretty well every math or science book that has a table of random numbers at the back uses an excerpt of this book for the table. The scientists were concerned that the machine generation of numbers could result in some numbers appearing more frequently at the start of the process, just like flipping a coin ten times isn't always going to produce exactly five heads and five tails. Over time it would all even out. However, they had a further concern: what if numbers that appeared more frequently at the start of the process conditioned, or burned in, the electronic circuit, so that those same numbers were more likely to appear in subsequent generations? Then the numbers wouldn't be truly random. So they had to perform inferential statistical tests on the hypotheses that the numbers were or were not random. They had to regenerate some numbers because at times they were not confident that the number generation was truly random, owing to the hypothesized burn-in. They couldn't "prove" burn-in, but their inferential statistics couldn't always discount it either, so they had to take its possible existence into account.
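Just as a rough illustration of what an inferential test for randomness looks like (this is a modern sketch, not the RAND statisticians' actual procedure, and the sample generator here is just Python's built-in one standing in for their machine): a chi-square goodness-of-fit test checks whether each digit 0 through 9 shows up about as often as uniformity predicts.

```python
import random

def chi_square_uniform(digits):
    """Chi-square statistic for the hypothesis that the digits 0-9
    each appear with equal frequency in the given sequence."""
    counts = [0] * 10
    for d in digits:
        counts[d] += 1
    expected = len(digits) / 10  # equal share for each digit under uniformity
    return sum((c - expected) ** 2 / expected for c in counts)

# Generate a sample the way the RAND team might have sampled their machine's
# output (hypothetical stand-in: Python's PRNG).
random.seed(0)
sample = [random.randrange(10) for _ in range(10_000)]
stat = chi_square_uniform(sample)

# With 9 degrees of freedom, the 5% critical value is about 16.92.
# A statistic above that would make you doubt the digits are uniform --
# which is roughly the judgment call that forced them to regenerate batches.
print(stat, stat < 16.92)
```

A perfectly even sequence gives a statistic of zero; a heavily skewed one gives a large value. The point is that the test can only cast doubt, never prove bias absent, which is exactly the bind the preface describes.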
Fascinating stuff. Anyway, I find it funny when scientific types dispute burn-in, and then turn around and use textbooks with random number tables that were compiled with the possibility of burn-in in mind. The irony is wonderful.
So, if you discount the possibility of burn-in, there's some pretty serious brain power and some formidable statistical analysis you'll have to refute.