Some thoughts on ASR and the reviews


I’ve briefly looked at some online reviews of budget Tekton speakers from ASR and YouTube. Both rely on Klippel quasi-anechoic measurements to produce simulated "in-room" responses.

As an amateur speaker designer and a lover of graphs and data, I have some thoughts. I mostly hope this helps the entire A’gon community get a little more perspective on how a speaker builder would think about the data.

Of course, I’ve only skimmed the data I’ve seen, I’m no expert, and have no eyes or ears on actual Tekton speakers. Please take this as purely an academic exercise based on limited and incomplete knowledge.

1. Speaker pricing.

One ASR review spends an amazing amount of time and effort analyzing the ~$800 US Tekton M-Lore. That price compares very favorably with a full Seas A26 kit from Madisound, which runs around $1,700. I’m not sure speakers this inexpensive deserve quite the nit-picking done here.

2. Measuring mid-woofers is hard.

The standard practice for analyzing speakers is called "quasi-anechoic." That is, we approximate measurements made in a room free of reflections or boundaries. You do this by taking very close measurements (within 1/2") of each driver and blending them together. There are a couple of ways this can be incomplete, though.
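To make the "quasi-anechoic" idea concrete, here’s a rough sketch (in Python, with made-up distances, not any reviewer’s actual setup) of why the farfield part of such a measurement has a low-frequency floor: you gate out the first room reflection, and a gate of length T can’t resolve much below 1/T Hz, which is exactly why the close-mic (nearfield) data has to be spliced in down low.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s at roughly 20 C

def gate_window_s(direct_path_m: float, reflected_path_m: float) -> float:
    """Time between the direct sound's arrival and the first reflection's."""
    return (reflected_path_m - direct_path_m) / SPEED_OF_SOUND

def lowest_resolved_hz(gate_s: float) -> float:
    """A reflection-free window of length T cannot resolve features much below 1/T Hz."""
    return 1.0 / gate_s

# Hypothetical setup: mic at 1 m, nearest boundary (floor/ceiling) 1.2 m away.
direct = 1.0
reflected = 2 * math.hypot(direct / 2, 1.2)  # mirror-image reflection path length

gate = gate_window_s(direct, reflected)
print(f"reflection-free window: {gate * 1000:.1f} ms")
print(f"gated measurement valid above roughly {lowest_resolved_hz(gate):.0f} Hz")
```

With these numbers the window is under 5 ms, so the gated farfield data is only trustworthy above roughly 200 Hz; everything below that comes from the nearfield splice.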

a - Mid-woofers measure much worse this way than in a truly anechoic room. The 7" Scan-Speak Revelators are good examples of this. The close-mic response looks deceptively bad, while the 1 m in-room measurements smooth out a lot of the problems. If you took the close-mic measurements (as seen in the spec sheet) at face value, you’d design the wrong crossover.

b - Baffle step - As popularized and researched by the late, great Jeff Bagby, the effects of the baffle on the output need to be included in any whole speaker/room simulation, which of course also means the speaker should have compensation built in when it is not a near-wall speaker. I don’t know enough about the Klippel simulation, but if this is not included you’ll get a bass-light experience compared to real life. The effect of baffle step compensation is more bass, but an overall lower sensitivity rating.
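For readers unfamiliar with baffle step: a common rule of thumb (my illustration, not anything from the reviews) places the center of the transition near f3 = 115 / W, with W the baffle width in meters. Below f3 the speaker radiates into full space and loses up to ~6 dB on axis, which is why compensating for it trades sensitivity for flat bass.

```python
def baffle_step_f3_hz(baffle_width_m: float) -> float:
    """Rule-of-thumb center frequency of the baffle step (f3 = 115 / W)."""
    return 115.0 / baffle_width_m

# Hypothetical 10" (0.254 m) wide tower baffle:
print(f"baffle step centered near {baffle_step_f3_hz(0.254):.0f} Hz")  # 453 Hz
```

So on a typical tower the transition sits right in the midbass, exactly where an uncompensated design will sound lean away from walls.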

For both of those reasons, an actual in-room measurement is critical to assessing actual speaker behavior. We may not all have the same room, but this is a great way to see the actual mid-woofer response as well as the effects of any baffle step compensation.

Looking at the quasi-anechoic measurements done by ASR and Erin, it _seems_ that these speakers are not compensated, which may be OK if close-wall placement is expected.

In either event, you really want to see the actual in-room response, not just the simulated response, before passing judgment. If I had to critique based strictly on the measurements and simulations, I’d 100% wonder whether a better design would trade sensitivity for more bass, and the in-room response would tell me that.

3. Crossover point and dispersion

One of the most important choices a speaker designer has is picking the -3 or -6 dB point for the high and low pass filters. A lot of things have to be balanced and traded off, including cost of crossover parts.
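To show what those -3 dB points mean in practice, here’s a minimal sketch (my example, using textbook 2nd-order Butterworth responses, not Tekton’s actual crossover) of the low- and high-pass magnitudes around a hypothetical 2 kHz crossover:

```python
import math

def lowpass_db(f: float, fc: float) -> float:
    """Magnitude (dB) of a 2nd-order Butterworth low-pass at frequency f."""
    return -10 * math.log10(1 + (f / fc) ** 4)

def highpass_db(f: float, fc: float) -> float:
    """Magnitude (dB) of a 2nd-order Butterworth high-pass at frequency f."""
    return -10 * math.log10(1 + (fc / f) ** 4)

fc = 2000.0  # hypothetical crossover point
for f in (1000.0, 2000.0, 4000.0):
    print(f"{f:>6.0f} Hz: LP {lowpass_db(f, fc):6.2f} dB, HP {highpass_db(f, fc):6.2f} dB")
```

Each section is about 3 dB down at fc and falls at roughly 12 dB/octave beyond it, which is why moving fc down an octave so directly shifts work (and excursion demands) onto the tweeter.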

Both of the reviews above seem to imply a crossover point that is too high for a smooth transition from the woofer to the tweeter. No speaker can avoid rolling off the treble as you go off-axis, but the best designs do so very evenly. This gives the best off-axis performance and offers great imaging and a wide sweet spot. You’d think this was a budget-speaker problem, but it is not. Look at reviews of B&W’s D-series speakers and many Focal models for examples of expensive, well-received speakers that don’t excel at this.
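Why does a too-high crossover hurt dispersion? One common rule of thumb (again my numbers, not the reviewers’): a cone driver starts to narrow, or "beam," near the frequency whose wavelength equals its effective cone diameter. Crossing over well below that keeps the woofer’s and tweeter’s off-axis output similar through the handoff.

```python
SPEED_OF_SOUND = 343.0  # m/s

def beaming_onset_hz(effective_cone_diameter_m: float) -> float:
    """Rough frequency where a piston's wavelength equals its diameter."""
    return SPEED_OF_SOUND / effective_cone_diameter_m

# Hypothetical 6.5" midwoofer with ~13 cm of radiating cone:
onset = beaming_onset_hz(0.13)
print(f"beaming onset near {onset:.0f} Hz")
```

With this illustrative driver the narrowing starts in the mid-2 kHz range, so a 3 kHz crossover hands off from an already-beaming woofer to a wide-dispersion tweeter, and the off-axis response shows a dip right through the transition.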

Speakers which DO typically excel here include Revel and Magico. This is by no means a story that you should buy Revel because B&W sucks. Buy what you like. I’m just pointing out that this limited-dispersion problem is not at all unique to Tekton. In fact, many other Tekton speakers don’t suffer this particular set of challenges.

In the case of the M-Lore, the tweeter has amazingly good dynamic range. If I were the designer, I’d definitely want to ask whether I could lower the crossover point by 1 kHz, which would give up a little power handling but improve the off-axis response. One big reason not to is crossover cost: I might have to add more parts to flatten the tweeter’s response well enough to extend its useful range. In other words, a higher crossover point may hide tweeter deficiencies. Again, Tekton is NOT alone if they did this calculus.

I’ve probably made a lot of omissions here, but I hope this helps readers think about speaker performance and cost in a more complete manner. Listening tests always matter more than measurements, so finding reviewers with trustworthy ears is more important than taste-makers who let the tools, which may not be properly used, judge the experience.

erik_squires

@deep_333 - I get you, thanks for your post : ) - but I will always engage those who either don’t prevaricate, or at least acknowledge when it is being done. More vitally, the hope is my posts are being read by those on the fence, you see, who might be persuaded that the relationship between empiricism and rationalism is required for better knowledge and informed choices, and not just one over the other…..and certainly not measurements as final arbiter.

But it is an uphill task facing the indoctrinated. One of them from oakcreek doesn’t even know the definition of high fidelity, and has been so brainwashed by asr into believing it is about signal integrity, that he hasn’t even bothered to look up or study what high fidelity is, or where the term came from.

My hope is that others will.

I haven’t yet lost hope for markwd, but it will take a while to revert.

@markwd, thank you for your reply - I hope you have the patience for my response - a touch busy today : )

 

In friendship - kevin

@kevn Take your time! I just got back from the symphony and got caught up myself. I just do this for fun anyways. I have neither ego nor business nor status invested in this topic but do find it fascinating, like friscalating light through a dusty chandelier.

@markwd thanks for that - ok then, what first needs to be established is a datum we can both agree on. The gist of this datum is the tradeoff inherent in measurement - most clearly established in the Heisenberg uncertainty principle, and its somewhat equivalent in acoustics, the Fourier uncertainty principle.


What the Heisenberg uncertainty principle says is that, at the scale of quantum mechanics, it is not possible to accurately measure two related physical properties of a particle simultaneously. That is, if the velocity of a particle can be measured accurately, there will be doubt regarding its precise location, and vice versa. If that doesn’t already ring bells in your head, take a look at the Fourier uncertainty principle, which limits the precision with which the duration and frequency of a sound can be measured simultaneously.
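For the numerically inclined, a quick back-of-envelope sketch (my illustration only, using the Gabor limit for Gaussian pulses, delta_t x delta_f >= 1/(4*pi)): if a timing acuity of about 3 ms is taken as the time uncertainty, the limit says a single simultaneous measurement cannot also resolve frequency better than roughly 26.5 Hz.

```python
import math

# Gabor limit: delta_t * delta_f >= 1/(4*pi) for Gaussian pulses.
GABOR_LIMIT = 1 / (4 * math.pi)  # ~0.0796

def min_freq_resolution_hz(delta_t_s: float) -> float:
    """Smallest simultaneous frequency uncertainty allowed for a given time uncertainty."""
    return GABOR_LIMIT / delta_t_s

# Illustrative: a 3 ms timing window implies at best ~26.5 Hz frequency resolution.
print(f"{min_freq_resolution_hz(0.003):.1f} Hz")
```

The interesting claim in the article below is that listeners beat this joint bound, which a linear time-frequency analysis cannot do.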

The full article is here https://phys.org/news/2013-02-human-fourier-uncertainty-principle.html - originally posted by one of our learned, if a touch longwinded members, mahgister.

The article establishes that, in careful listening tests among a group of mixed participants, the human cochlea, non-linear as it is, equalled or outperformed the limit set by the Fourier uncertainty principle - in one case, on combined frequency and timing scores, by a factor of 13. The top score for timing acuity was three milliseconds.

That’s three thousandths of a second.

If you will allow this datum as a qualified test of human hearing acuity, we can proceed to the conclusions of the test: our hearing is not only capable of performance equalling that of technical instrumentation on independent measurements of frequency or time, it exceeds instrumentation on the simultaneous measurement of both, limited as test equipment is by the Fourier uncertainty principle. This is vital, as the foundations of music itself are built on the simultaneity of frequency and the time domain.

I hope you better understand now why Amir’s measurements are not everything - he cannot accurately measure both frequency and timing, the very tenets of music, at the same time. He argues against the proven science of the Fourier uncertainty principle if he claims he can.

In relation to electromagnetism, which you relevantly queried me over when explanations were left incomplete - you know markwd, never mind the Fourier uncertainty principle limiting the simultaneous measurement of frequency and time; never mind why I hear realism that measurements cannot explain; the profound world of electromagnetism is still beyond the full understanding of rational science itself.

My intention was not to answer any questions regarding the relationships between magnetic flux and the audio signal. I wouldn’t have a clue! I only have the hypothesis of its absolute importance, it being so much a part of how electricity itself is transmitted. It was only vital that I got you interested in the question, because that is truly what science is all about: what part does magnetic flux play in audio signal transmission that we might be missing?

Science is as much the asking of empirical questions, as it is the delivery of rational answers. I hope you get more involved in all of science, and not just its rational side : )

 

In friendship - kevin 

 

@markwd ”But are you worried that the imprimatur might give new audio equipment "seekers" some kind of false belief that all they need is this particular kind of ASR science? I’m not too worried!”

 

Sorry I missed this, markwd. The thing is, when I started my audiophile journey, I knew nothing. Like most others. It would have been so easy to be sucked into the convenience of measurements to justify all my purchases, as anyone wanting to get the best value for the dollar would. However, I have never been one to take the easy road, and found it necessary to first understand the multiple viewpoints of any one issue, and then the relationships between them all, for balanced decision making. It takes immense effort to slowly build that comprehension of what hifi audio is about, and it is not an undertaking most would want to see through.

What got me truly started was the definition of high fidelity.

I realised most of us start our journey with a misconception of what the term means. The exact origins of the expression will probably never be known, but it is generally agreed that its usage was first seen in Billboard magazine back in 1933. Even before that however, the obsession existed to recreate the sound experience of live music through the recorded medium.

High Fidelity has always been about the reproduction of high quality sound through electrical equipment to be as similar as possible to the original sound.

Back then, distortion was so completely everywhere, it wasn’t even an issue - all that mattered was a medium that could just deliver some semblance of realism to the reproduced sound. They found it in vinyl. However awful those needles were back then (and they were practically big ugly heavy crappy needles) that semblance was reached. Everything thereafter became one long road of refinement to bring sound reproduction closer and closer to the original sound. To make sound reproduction more realistic. The absolute fidelity of the signal was never a goal - what mattered was how the reproduced sound compared to the original sound.

We turned high fidelity into signal fidelity at some point.

It really was to help make things, decisions, easier; to have some quantifiable and rational basis for commonality and reference. And so, our original reference that was the original sound, became marginalised, and for many, forgotten. We don’t need to make the effort to improve listening ability any more - the measurements did it for us.

Except that they didn’t - particular kinds of distortion, in fact, bring reproduced sound closer to the original sound. The manufacturers discovered that tiny relevant imperfections in the audio signal help create a closer approximation of realism. High Fidelity is this incredibly nuanced kitchen of finding that balance between eliminating damaging distortion from, and introducing relevant imperfection to, the signal to bring reproduced sound closer in realism to its original sound source.

Heresy, if opined from the viewpoint of signal fidelity, but hey, signal fidelity has never been what high fidelity is about.

This is what makes the arguments that amir or anyone else whips up from their measurements seem so silly: they’re arguing for pure signal fidelity in a hobby where its accomplishment defeats the entire purpose of high fidelity.

So yes, it is false belief and indoctrination I post against, because I want every audiophile to hear the amazing realism that is possible with the wonderful equipment that is out there, which they can only access when they develop the difficult and time consuming skill of listening and hearing ability.

 

In friendship - kevin

@kevn I previously addressed the issue of these hearing-excess-of-Fourier arguments, as well as heterodyning and nonlinear effects within the ear. The problem isn't that there are interesting experimental results; it's that they don't demonstrate that there is anything that can be done to audio equipment to implement better solutions to whatever gaps may be present. For instance, if I am a DAC designer there are several different pathways to accurately reproduce a signal, but there is no theory that says one approach will improve over another in matching the nonlinear merging properties of higher and lower frequencies in the cochlea.

Now, you can suggest that somehow listening on the part of the designer is allowing them to choose between design pathways but this is just speculation. It may be true, as I noted to @mahgister, but we don't know and neither does the designer.

So there is a certain faith built into all this speculation, just like god-of-the-gaps arguments in other online communities ("listening-in-the-gaps" arguments has a nice ring to it!). It's interesting but needs proof and a proper measurement methodology that shows a path forward for determining exactly how these phenomena impact equipment design and use.

Since you are a bit of a student of ideas in philosophy of science: one key idea in contemporary thinking on the topic, lifted from Wittgenstein, is that we must remain silent about things we have no knowledge of - and we have no knowledge of this. Until we develop it sufficiently, we do have an AP and spectral sweeps.