CD output voltage too high for preamp: what to do?


I have an Audible Illusions preamp which is designed for a 1.5V input signal. I am told that this is standard. My Ayre CX-7e puts out 2.25V. They play beautifully together but I have too little control over volume: I cannot use the preamp volume controls beyond 9 o'clock because it is way too loud. Likewise it is hard to get just the right volume in the limited range available. This is worse with some CDs for reasons I do not know.
Audible Illusions will change the preamp attenuation board for $275, but before I do so I wonder how others have dealt with the problem of newer CD players with higher outputs than their preamps were designed for.
gmargo
BTW, 2V (at 0dB level) is the output standard for CD players although, as you already know, it is often ignored. In crude A/B tests, the louder player will always sound more impressive.

That said, another major factor in your problem is the input sensitivity of the power amp, which may be a bit high as well. After all, 2.25V is only about 3.5dB more than 1.5V and should be within the ambit of the volume adjustment.
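For reference, the level difference between two voltages in dB is 20·log10 of their ratio. A quick sketch using the figures from the posts above:

```python
import math

def db_difference(v_a, v_b):
    """Level difference in dB between two voltages (20 * log10 of the ratio)."""
    return 20 * math.log10(v_a / v_b)

# 2.25V CD output vs. the 1.5V the preamp was designed for:
print(round(db_difference(2.25, 1.5), 2))  # about 3.52 dB, not 6 dB
```

A full 6dB would require doubling the voltage (3.0V vs 1.5V), so the Ayre's extra output alone is a fairly modest offset.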

Kal
I have a similar situation with a high gain amp (0.8V for full output, 30dB gain) and was having the same issue with the volume control, both for digital (1.5V) and analog playback. I just bought the Rothwell attenuators and they work for me. Now I can go to about 12 o'clock before the level begins to get too loud (previously between 9 and 10 was the norm). I don't notice additional noise, and used on the inputs of the amp I've lowered the gain from the preamp by about 10dB. Since this seems to be an issue with the CDP, I would first try these on the CD input of the preamp so as not to interfere with any other sources you may be using.
The fancy term "Attenuator" disguises the fact that we are talking about a couple of resistors. It's just a voltage divider. I would simply build it into the input jacks of the preamp. Do it with ordinary resistors, and when you get the volume right you can replace them with "audiophile" resistors if that makes you feel good.
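To put some numbers on the two-resistor divider idea: the attenuation is set by the ratio of the shunt resistor to the total. The resistor values below are just illustrative picks, not a recommendation for any particular preamp:

```python
import math

def divider_attenuation_db(r_series, r_shunt):
    """Attenuation of a simple voltage divider: Vout/Vin = r_shunt / (r_series + r_shunt)."""
    ratio = r_shunt / (r_series + r_shunt)
    return 20 * math.log10(ratio)

# Hypothetical example: 10k ohm series, 4.7k ohm shunt
print(round(divider_attenuation_db(10_000, 4_700), 1))  # about -9.9 dB
```

In practice you also want the divider's total resistance high enough not to load the CD player's output, and the shunt low enough that the preamp's input impedance doesn't shift the ratio much.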
This is a bit of a curiosity for me. If the output of a DAC is 1.5V, is there a minimum input impedance the preamp should have to accommodate the DAC's output? Does the preamp's gain play into this equation? My preamp has 4 gain settings: 11dB, 14dB, 17dB, and 20dB. I have always used the preamp with the 11dB setting, but since I am now using the Rothwell attenuators it would appear that only 1dB of gain is getting passed to the amp. That seems pretty low to me. It's been suggested by others to go to a passive, but I researched this and don't want to risk giving anything up in dynamics. The preamp is a Joule Electra LA-100 MkIII (20k ohm input impedance, 300 ohm output impedance).
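On the impedance question: what matters is the ratio of the load's input impedance to the source's output impedance; the loss is usually tiny when the load is 10x the source or more. A rough sketch, using the LA-100's 300 ohm output from the post and a hypothetical 20k ohm load (the actual amp input impedance isn't given here):

```python
import math

def loading_loss_db(z_source_out, z_load_in):
    """Level loss when a source output impedance drives a load input impedance."""
    return 20 * math.log10(z_load_in / (z_load_in + z_source_out))

# 300 ohm preamp output into an assumed 20k ohm amp input:
print(round(loading_loss_db(300, 20_000), 2))  # about -0.13 dB (negligible)

# Net gain through the chain: 11 dB preamp setting minus ~10 dB Rothwell attenuation
print(11 - 10)  # about 1 dB, consistent with the estimate in the post
```

So the low net gain comes from the attenuators, not from impedance loading; with a ~67:1 load-to-source ratio the interface itself costs almost nothing.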