Ckoffend and Jaffeassc, I would try it both ways, but I would expect that the amp and its power tubes would be more comfortable on the 4 ohm tap. The transformer actually does what it's named for: it transforms impedance.
Here's how that works:
Let's say you have a set of power tubes that expect 3000 ohms plate-to-plate. If you load the 8 ohm tap with a 4 ohm load, you have just cut the reflected impedance in half, so the power tubes now see 1500 ohms plate-to-plate. The result will be higher distortion and likely significantly less output.
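If it helps to see the arithmetic, here's a quick sketch in Python. The 3000 ohm plate-to-plate figure and the tap impedances are just the example numbers from above, not specs for any particular amp:

```python
def reflected_impedance(rated_primary, rated_tap, actual_load):
    """Plate-to-plate impedance the tubes see when actual_load hangs on a
    tap wound for rated_tap. The reflected impedance scales linearly with
    the load, by the fixed turns ratio squared for that tap."""
    turns_ratio_sq = rated_primary / rated_tap  # (Np/Ns)^2 for that tap
    return turns_ratio_sq * actual_load

# 4 ohm speaker on the 8 ohm tap: the primary impedance is cut in half.
print(reflected_impedance(3000, 8, 4))  # 1500.0 ohms plate-to-plate
print(reflected_impedance(3000, 8, 8))  # 3000.0 ohms, the intended match
```

The key point is that the transformer has no idea what load is "supposed" to be there; it just multiplies whatever you connect by the turns ratio squared.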
There are output transformers out there that are not that efficient on their 4 ohm taps, so with some amps the 8 ohm tap may actually be preferred, but most tube amps will likely be set right on the 4 ohm tap. This assumes, of course, that the transformer's windings actually give the power tubes the load they want to see, which is often not the case: the combination is an approximation. So I would also expect there are some output transformer situations where the 4 ohm tap gets the tubes closer to the ideal.
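Since the match is an approximation, you can ask which tap reflects a load closest to what the tubes actually want. A hypothetical sketch (the ideal of 3000 ohms and the transformer primaries are made-up example numbers, not real specs):

```python
def best_tap(ideal_primary, rated_primary, taps, speaker):
    """Return the tap whose reflected plate-to-plate impedance lands
    closest to the tubes' ideal load, for a given speaker impedance."""
    def seen(tap):
        # turns ratio squared for the tap, times the actual speaker load
        return rated_primary / tap * speaker
    return min(taps, key=lambda tap: abs(seen(tap) - ideal_primary))

# Transformer wound near the tubes' target: the matching tap wins.
print(best_tap(3000, 3400, [4, 8], speaker=4))  # 4

# Transformer wound high (6600 ohm primary): a 4 ohm speaker on the
# 8 ohm tap reflects 3300 ohms, closer to ideal than 6600 on the 4 ohm tap.
print(best_tap(3000, 6600, [4, 8], speaker=4))  # 8
```

The second case is the kind of situation where the "wrong" tap is actually the better match.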
If the 4 ohm tap is the correct tap, then loading the 8 ohm tap with 4 ohms will make the power tubes work harder, since more of the power they produce is dissipated inside the tubes rather than in the load. So there is a certain gamble in trying 4 ohms on the 8 ohm tap, although tubes are forgiving enough that it's not a risk to try; the bigger risk is leaving it set up that way.
This can be a big topic so I am glossing over a few things, but that's it in a nutshell...