No, because there is no standard for what type of output stage to use. The bias current is set to give the best linearity and distortion performance for that particular output stage, while keeping an eye on power dissipation (heat sinking is very expensive).
There really is no such class as Class AB; the term describes the mode of operation when a Class A amp hits a low impedance and one of the pairs of output transistors stops conducting. Almost all amps are Class B, biased just above the transistor Vbe so the transistors keep conducting through the dead band from -0.7 V to +0.7 V, avoiding crossover distortion. Increasing the bias beyond that point (but short of Class A) may reduce crossover distortion further, but you pay a price in gm-doubling distortion.
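To see why that dead band matters, here is a minimal sketch (assuming an idealized, completely unbiased push-pull stage with a fixed 0.7 V Vbe drop and no feedback; the function name and sample count are made up for illustration) showing how the output flattens to zero whenever the input sits between -0.7 V and +0.7 V:

```python
import math

def class_b_output(vin, vbe=0.7):
    """Idealized unbiased Class B push-pull stage: each output
    transistor only conducts once the input exceeds its ~0.7 V
    Vbe drop, leaving a dead band from -vbe to +vbe volts."""
    if vin > vbe:
        return vin - vbe      # NPN half conducts
    if vin < -vbe:
        return vin + vbe      # PNP half conducts
    return 0.0                # dead band: neither conducts

# Push a 2 V-peak sine through the stage: every sample whose input
# magnitude is below 0.7 V comes out as zero (crossover distortion).
samples = [2.0 * math.sin(2 * math.pi * n / 100) for n in range(100)]
out = [class_b_output(v) for v in samples]
dead = sum(1 for v in out if v == 0.0)  # samples lost to the dead band
```

Biasing the stage "just above Vbe", as described above, amounts to shrinking that dead band toward zero so the two halves hand over smoothly.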
How far above that minimum Vbe setting a manufacturer biases the stage is chosen to maximize performance. For example, a Darlington output stage needs about 100 mA of bias for best performance, while a complementary feedback pair output stage needs about a tenth of that. So it is not possible to standardize something like this.