First, some comments on points you raised prior to the six questions at the end of your post:
A number of online calculators can be found that will convert dBu to voltage. 0 dBu corresponds to 0.775 volts. As you realize, your speaker provides adjustable sensitivity covering a range of -6 to +6 dBu for 100 dB SPL at 1 meter. -6 dBu corresponds to approximately 0.775/2 = 0.39 volts, and +6 dBu corresponds to approximately 0.775 x 2 = 1.55 volts (6 dB being very nearly a factor of 2 in voltage).
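For anyone who would rather not rely on an online calculator, here is a minimal sketch of the conversion in Python (the 0.775 volt reference is simply the standard dBu definition; the exact results are about 0.388 V and 1.546 V, which I rounded above):

```python
import math

def dbu_to_volts(dbu: float) -> float:
    """Convert a level in dBu to an RMS voltage (0 dBu = 0.775 V RMS)."""
    return 0.775 * 10 ** (dbu / 20)

def volts_to_dbu(volts: float) -> float:
    """Convert an RMS voltage to a level in dBu."""
    return 20 * math.log10(volts / 0.775)

for level in (-6, 0, 6):
    print(f"{level:+d} dBu = {dbu_to_volts(level):.3f} V")
# -6 dBu ≈ 0.388 V, 0 dBu = 0.775 V, +6 dBu ≈ 1.546 V
```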
At the most sensitive setting (-6 dBu), **if** the speaker were able to handle an input of 4.4 volts in a reasonably linear manner and without its self-protection circuitry kicking in (and in this case it apparently would kick in), the result would be an SPL at 1 meter of:
100 dB + 20 x log(4.4/0.39) = 121 dB,
where "log" is the base 10 logarithm.
So your DAC certainly provides enough voltage to drive the speaker, without the additional gain of a preamp.
However, as you realize, sonics may be adversely affected by impedance issues in this case. I'll first say that the 10x rule of thumb guideline is commonly stated in an oversimplified manner. IMO a proper statement of it is as follows (I'm quoting from a post I made in an earlier thread):
To assure impedance compatibility the 10x rule of thumb guideline should be applied at the frequency for which the output impedance of the component providing the signal is highest. Most impedances are specified at a mid-range frequency such as 1 kHz.....
If as is often the case ... [the output impedance at the highest audible frequency] is not known, and is not indicated in published measurements (such as Stereophile often provides), then to be safe a considerably higher ratio than 10x should be used, something like 50x or 75x IMO. Especially if the component is tube-based and is likely to have a coupling capacitor at its output.
Also, to clarify a common misconception I should add that failing to meet that guideline does not necessarily mean that there will be an impedance compatibility problem. It depends on how much **variation** there is in the output impedance over the frequency range. But meeting that guideline (at all audible frequencies) assures that there won’t be an impedance compatibility problem.
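Expressed as a trivial check (the 600 ohm worst-case output impedance here is just an illustrative number, not a figure for any of your actual components; the key point is that the ratio should be evaluated at the frequency where the source's output impedance is highest):

```python
def impedance_ratio_ok(input_impedance: float, source_output_impedance: float,
                       min_ratio: float = 10.0) -> bool:
    """Check the rule-of-thumb ratio between a destination's input impedance
    and the source's worst-case output impedance (both in ohms)."""
    return input_impedance >= min_ratio * source_output_impedance

# A 10K balanced input driven by a source whose worst-case output impedance is 600 ohms:
print(impedance_ratio_ok(10_000, 600))        # True  (ratio is about 16.7x)
print(impedance_ratio_ok(10_000, 600, 50.0))  # False (falls short of a 50x margin)
```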
In the case of your DAC my guess is that there isn't a great deal of variation of the output impedance over the frequency range, but nevertheless the impedances that are involved do not, IMO, appear to be optimal or even suitable for use with a passive preamp, whether it be resistance-based or transformer-based. And cable capacitance, which is proportional to length, would also be a potential issue with a passive preamp, as you mentioned.
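To put a rough number on the cable capacitance concern, here is a first-order estimate of the low-pass corner formed by the driving impedance and the total cable capacitance (the 2500 ohm worst-case attenuator impedance, 30 pF/foot, and 15 foot run are purely illustrative assumptions, not figures for your particular components or cables):

```python
import math

def rc_corner_frequency_hz(source_impedance_ohms: float,
                           capacitance_pf_per_foot: float,
                           cable_length_feet: float) -> float:
    """Approximate -3 dB corner of the low-pass filter formed by the driving
    impedance and the total cable capacitance (simple first-order RC model)."""
    c_farads = capacitance_pf_per_foot * cable_length_feet * 1e-12
    return 1.0 / (2 * math.pi * source_impedance_ohms * c_farads)

# Illustrative numbers only: a passive attenuator presenting a worst-case
# source impedance of about 2500 ohms, driving 15 feet of 30 pF/foot cable:
print(f"{rc_corner_frequency_hz(2500, 30, 15):,.0f} Hz")  # roughly 141 kHz
```

In that particular example the corner is well above the audible range, but higher attenuator values or longer runs push it lower, which is why cable capacitance matters much more with a passive than with an active preamp's low output impedance.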
"Mains voltage: 100, 120, 220, or 230 V according to region" - not exactly sure what this is. This simply means that the speaker is designed to be operated at the AC line voltage used in the country to which the speaker is supplied, which is 120 volts in the case of speakers supplied to the USA, of course.
Regarding your questions, I don’t see the combination of a passive preamp and a buffer as being an attractive solution, given that there are impedance concerns involving both the source and destination components. There are countless solid state preamps, on the other hand, that will have no problem driving 10K balanced inputs (as I mentioned earlier, that can be verified from Stereophile’s measurements if they have reviewed the particular component, or alternatively if the component’s specified nominal output impedance is, say, 200 ohms or less), and that will provide an input impedance of, say, 47K or more, which would certainly be suitable. An input impedance considerably lower than that could very conceivably also be suitable, although I’d check with Denafrips to be sure.
Good luck. Regards,
-- Al