I'd like to weigh in on what strikes me as a genuine gentlemen's misunderstanding.
I believe that Raul's main point is that when there is a presumed, widely-adopted technical standard such as the RIAA equalisation curve (or 75 µs FM de-emphasis, or the 200 nWb/m cassette Dolby level), any deviation from that standard makes for an inherently inaccurate reproduction. In general I agree with him, including as it pertains to a "statement"-level phono preamplifier intended for magnetic/dynamic cartridges . . . most errors here are the result of cost-cutting, poor engineering, and poor manufacturing tolerances. Raul's specification of +/- 0.1 dB error for this particular part of the reproduction chain is definitely attainable, and quite arguably to be expected given the high prices of many commercial units.
I also think that when a manufacturer claims something like "don't worry about that [common-practice technical standard] . . . it's not necessary with my [new, better, revolutionary] design" -- that is grounds for the deepest skepticism. But in the PARTICULAR case of the Strain Gauge cartridge . . . there may be some technical/historical validity to this claim . . . and unfortunately, Mr. Ledermann doesn't do a very good job explaining the subject in his literature, or in his posting . . . so I'm going to be presumptuous and give it a try:
Historically, the technical standards for the RIAA curve (318 µs, 3180 µs, and 75 µs) come from an amalgam of industry practices at the major record manufacturers, as recorded music (and high-fidelity sound) became more and more popular after World War II. At the time, the methods of cutting records were carried over from the earliest days of electronic recording before the War. In those early days, most electrical phono cartridges were crystal types, and there was NO EQUALISATION used on playback. Thus, the engineers developed electrical pre-equalisation practices for cutting records that were designed to give the best-sounding/most accurate reproduction they could, with the types of pickups (both electrical and acoustical) that most people were using.
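For anyone curious what those three time constants actually do, here's a quick sketch (in Python -- my own illustration, not anything from Raul's or Mr. Ledermann's designs) of the ideal RIAA playback curve they define, normalised to 0 dB at 1 kHz:

```python
import math

# Ideal RIAA playback time constants from the post:
# 3180 us and 318 us (bass shelf), 75 us (treble rolloff).
T1, T2, T3 = 3180e-6, 318e-6, 75e-6

def riaa_playback_db(f):
    """Ideal RIAA playback-curve magnitude in dB, normalised to 0 dB at 1 kHz."""
    def mag(freq):
        w = 2 * math.pi * freq
        # One zero at 318 us; poles at 3180 us and 75 us.
        return math.hypot(1, w * T2) / (math.hypot(1, w * T1) * math.hypot(1, w * T3))
    return 20 * math.log10(mag(f) / mag(1000.0))

for f in (20, 100, 1000, 10000, 20000):
    print(f"{f:6d} Hz: {riaa_playback_db(f):+7.2f} dB")
```

Run it and you get the familiar endpoints: roughly +19.3 dB of bass boost at 20 Hz and roughly -19.6 dB of treble cut at 20 kHz on playback -- nearly a 40 dB swing across the audio band, which is why even small equalisation errors are audible.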
Enter the 1950s . . . and "magnetic" phono cartridges (moving-magnet and moving-coil) became the dominant high-performance type available, as a result of refined manufacturing techniques and improved magnetic materials developed during the War (e.g. Alnico). The only caveat was that these magnetic cartridges respond to the VELOCITY of the stylus movement, rather than the AMPLITUDE of the stylus movement, as crystal types had. Because of this, magnetic cartridges REQUIRED EQUALISATION in the preamplifier. (Crystal cartridges were replaced by "ceramic" types that still didn't require equalisation - and continued as the common type for cheaper equipment until the Digital Revolution [sic] of the 1980s.)
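The velocity-vs-amplitude distinction comes straight out of basic calculus: for a sinusoidal groove displacement x(t) = A·sin(2πft), the velocity is 2πfA·cos(2πft), so for constant groove amplitude a velocity-sensitive pickup's output rises at 6 dB per octave while an amplitude-sensitive pickup's output stays flat. A toy sketch of my own, just to show the slope:

```python
import math

def pickup_outputs(f, amplitude=1.0):
    """Peak output (arbitrary units) of the two pickup families for a
    sinusoidal groove displacement x(t) = A*sin(2*pi*f*t)."""
    amp_sensitive = amplitude                    # crystal/ceramic: tracks x(t)
    vel_sensitive = 2 * math.pi * f * amplitude  # magnetic: tracks dx/dt
    return amp_sensitive, vel_sensitive

# Constant groove amplitude across two octaves:
for f in (1000, 2000, 4000):
    a, v = pickup_outputs(f)
    print(f"{f} Hz: amplitude pickup = {a:.2f}, velocity pickup = {v:.0f}")

# The velocity pickup's output doubles (+6.02 dB) per octave; the
# amplitude pickup's doesn't -- hence playback EQ for one and not the other.
slope = 20 * math.log10(pickup_outputs(2000)[1] / pickup_outputs(1000)[1])
print(f"velocity pickup slope: {slope:.2f} dB/octave")
```

That 6 dB/octave tilt (on top of the cutting pre-emphasis) is exactly what the preamplifier's RIAA network exists to undo for magnetic cartridges.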
This change in standards between phono cartridges and preamps wasn't a big deal at the time (late 1940s/early 1950s), because most of the public was buying integrated phono/tuner/preamp/amp/loudspeaker console sets. Audio dorks were buying units like the McIntosh C8 and Marantz Model 1, with highly customizable equalisation options. It wasn't until the late 1950s/early 1960s, with magnetic cartridges being the universal choice for high-quality systems and this new "stereo" thing on the horizon, that there was a need for true standardisation of recording/reproducing equalisation. So the RIAA set a standard that was a pretty close "midpoint" to the then-current equalisation practices of the major record manufacturers . . . and those practices were refinements of the techniques developed for crystal cartridges.
So the hi-fi world went completely over to magnetic (velocity-sensitive) cartridges with standard equalisation for recording and playback, while ceramic (amplitude-sensitive) cartridges stayed around, with no playback EQ. This worked well, because the average frequency response of a CARTRIDGE was, in general, MUCH less accurate than the electronic equalisation found in an average stereo . . . and ceramic cartridges were "close enough" for the cheap systems in which they were found.
But today, we're all trying to get that absolute last "n-th degree" of performance from our turntables and record collections, and we have more high-quality cartridges (both new and old) to choose from than ever . . . so you have guys like Raul who feel the only way to truly hear what a cartridge and tonearm have to offer is to get the electronics so precise that they're completely out of the picture. I agree with him, and applaud him for his efforts.
BUT we also have a guy like Peter Ledermann, who has developed a strain-gauge cartridge system . . . and this system is AMPLITUDE-sensitive, NOT velocity-sensitive. And this puts the strain-gauge system in a unique position, because it may actually be the first time in over fifty years that there has been serious development in ANY ultra-high-performance, amplitude-sensitive phonograph pickup. And of course, the concept of "absolutely correct RIAA equalisation" is a complete non sequitur for an amplitude-sensitive cartridge . . . and this has ALWAYS been the case . . . we've just all forgotten.
So to the question of "does the Strain Gauge system feature extremely accurate RIAA compensation?", the answer is "of course not - that's for magnetic cartridges!". And to the question of "how accurate is the frequency response of the Strain Gauge cartridge/preamplifier combination for RIAA-standard records?", Mr. Ledermann's response is something like "very accurate, when compared to other [conventional] cartridge/preamplifier combinations."
And his specs are a reasonable indication of this . . . because no CARTRIDGE in the world has anything near the +/- 0.1 dB response accuracy that Raul is (justifiably) proud of in his preamplifier design.