Wow, everyone ran straight past the teachable moment. In person, I would have stopped you at "light bulb". I'll keep everything to an observation-based explanation, except for one closing point.
Decades ago, I worked in an academic facility where the lecture rooms had industrial lighting installed. Because they wanted the architecture to look circa the late 1800s, standard fluorescent lights were not installed; instead there were glass shells with recessed bulbs containing sodium, argon, or some similar gas that took its time reaching saturation. The process took around 20 minutes, and most people said it was akin to a sunrise, following the same timing and the same perceived ramp-up in brightness.
If you have ever watched streetlights come on, they take their time reaching full brightness. The same goes for most stadium lights.
Consumer LED "bulbs" actually cycle off and on rapidly at a rate tied to the frequency of the mains electricity, and many people notice this as trouble with auto white balance, or with shooting video, in a setting where the only light source is common consumer LED fixtures.
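As a rough illustration (a sketch, not a measurement): assuming a cheap driver that simply rectifies the mains, so the light pulses at twice the line frequency, here's why some camera shutter speeds average the flicker out while others catch it. The driver behavior and the specific shutter values are illustrative assumptions; better drivers smooth the flicker out entirely.

```python
# Sketch of why cameras catch consumer-LED flicker, assuming a cheap driver
# that simply rectifies the mains so the light pulses at twice the line
# frequency. Better drivers smooth this out; the numbers are illustrative.

MAINS_HZ = 60                      # North American line frequency (50 Hz elsewhere)
FLICKER_HZ = 2 * MAINS_HZ          # full-wave rectified light pulses at 120 Hz
flicker_period_s = 1 / FLICKER_HZ

for shutter_s in (1/30, 1/60, 1/120, 1/250, 1/1000):
    cycles = shutter_s / flicker_period_s  # flicker cycles captured per exposure
    # An exposure spanning a whole number of cycles averages the flicker out;
    # a faster shutter sees only part of a cycle, so brightness and color
    # shift frame to frame, and rolling shutters show banding.
    whole = cycles >= 1 and abs(cycles - round(cycles)) < 0.05
    note = "averages out" if whole else "can show banding / white-balance drift"
    print(f"shutter 1/{round(1/shutter_s):>4} s -> {cycles:5.2f} flicker cycles ({note})")
```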
For incandescent light bulbs, if you ever get the chance to view a slow-motion video, you will see that the tungsten filament (the thin "w"-shaped wire in the middle) "slowly" starts to glow... in slow motion, of course. Your statement treated it as simply "off" and "on". Now, why tungsten? It's because tungsten has the highest melting point of pretty much any easily obtained metal, so it can glow brighter (without starting to melt) and shift from a red/orange glow toward white.

The more interesting fact is that as metals heat up, they become more resistive to electricity: the atoms vibrate more, scattering the electrons, so it's harder for current to pass. This is also why some exotic computing hardware is cooled to ridiculously low temperatures, so that specialty materials in the processors, backplanes, etc. can "super"conduct. Not all metals superconduct, but keeping temperatures lower generally assists the flow of electricity.

So why mention that? If the metal in a light bulb increases in resistance as its temperature increases, then the brighter the bulb gets, even if it takes only a second or two to get there, the less electrical current it draws. By Ohm's law, current and resistance are inversely related when the voltage (e.g., 120 V or 115 V) is held constant. Therefore, "light bulbs" are more efficient after they've been running for a second or two; they draw the most current in the first milliseconds after being turned on, and then the demand drops quickly. So if a child is flicking a light off and on, each brief moment of illumination soaks up more energy than that same moment would if the bulb were just left on. To be clear, I'm only referring to the energy consumed while it is on and producing light; an "off" bulb obviously takes zero energy by comparison. And once a light hits steady state, leaving it on for days wouldn't improve efficiency compared to a couple of minutes. I'm referring to the multitude of things that happen in the snap of a finger when you turn a light "on".
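Here's a back-of-the-envelope sketch of that inrush effect, assuming a typical 60 W / 120 V bulb and the commonly cited ballpark that a tungsten filament's cold resistance is about a tenth of its hot resistance (the exact ratio varies by bulb):

```python
# Back-of-the-envelope inrush sketch for an incandescent bulb, assuming a
# 60 W / 120 V bulb and a cold:hot resistance ratio of about 1:10 for the
# tungsten filament (a commonly cited ballpark; real ratios vary by bulb).

VOLTS = 120.0            # line voltage, held constant
RATED_WATTS = 60.0       # steady-state (hot) power

# Ohm's law at steady state: P = V^2 / R  =>  R_hot = V^2 / P
r_hot = VOLTS ** 2 / RATED_WATTS     # ~240 ohms when glowing white-hot
r_cold = r_hot / 10                  # ~24 ohms at room temperature (assumed ratio)

i_hot = VOLTS / r_hot                # ~0.5 A steady-state current
i_cold = VOLTS / r_cold              # ~5 A inrush current at the flick of the switch
p_cold = VOLTS * i_cold              # ~600 W instantaneous draw in the first milliseconds

print(f"hot:  {r_hot:.0f} ohm, {i_hot:.2f} A, {VOLTS * i_hot:.0f} W")
print(f"cold: {r_cold:.0f} ohm, {i_cold:.2f} A, {p_cold:.0f} W")
print(f"switch-on draws ~{i_cold / i_hot:.0f}x the steady-state current")
```

Under those assumptions, the flick of the switch briefly pulls around ten times the steady-state current, which is also why incandescents so often burn out at the moment you turn them on.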
Now, in terms of audio and video electronics, there's a lot more human-factors engineering going on to make you think the gear is instantly usable, but if you actually count the seconds before you can use it, it has already warmed up by then. Pretty much everything said above covers the explanations and analogies for electronics. And yes, cooler is better, but in audio, roughly 90% of the energy is wasted as heat in amplification, and then roughly 90% of the energy going to the speakers is also wasted as heat. So a certain range of operating "warmth" is unavoidable; ventilated systems are designed to work within those temperatures, and unless it's supercomputing, lower operating temperatures likely would not improve the sound. But if you wouldn't slam the gear into 1st and stomp your foot to the floor of your $50K car the instant the engine turned over, I wouldn't do it with a $10K or $50K collection of audio equipment either.
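Taking those 90% figures at face value (actual amplifier and speaker efficiencies vary widely by design; the wall-power number is hypothetical), the chained losses work out like this:

```python
# Chained-loss arithmetic, taking the 90%-wasted figures above at face
# value (actual amp and speaker efficiencies vary widely by design).

wall_power_w = 100.0        # hypothetical power drawn from the wall
amp_efficiency = 0.10       # 90% lost as heat in amplification
speaker_efficiency = 0.10   # 90% of what reaches the speaker is lost as heat

to_speaker_w = wall_power_w * amp_efficiency    # 10 W of signal reaches the speakers
acoustic_w = to_speaker_w * speaker_efficiency  # 1 W becomes actual sound
heat_w = wall_power_w - acoustic_w              # everything else is heat

print(f"{wall_power_w:.0f} W in -> {acoustic_w:.0f} W of sound, {heat_w:.0f} W of heat")
# i.e. roughly 99% of the wall power ends up warming the room, which is
# why some operating warmth is unavoidable by design.
```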