A ported or transmission-line speaker can never be time coherent- nor can a dipole, bipole, omni, OR a sealed box. See my 1/16/03 post on why any moving system has a natural time delay down at its resonance. The ported or "transmission line" design (which is a port variation and NOT a true transmission line) has the SAME phase shift as a sealed box for the sound leaving the front of the cone, and the port opening's output adds further time delay AND a polarity inversion.
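A rough sketch of the resonance delay described above, using only textbook filter models (the f0 and Q values are illustrative, not from any particular speaker): a sealed box behaves like a 2nd-order high-pass, and a ported box's summed output like a 4th-order high-pass at the same resonance, so the ported system shows twice the phase rotation and group delay near f0:

```python
import cmath, math

def sealed_hp(f, f0=40.0, q=0.707):
    """2nd-order high-pass: the acoustic rolloff of a sealed box (illustrative f0, Q)."""
    s = 1j * f / f0  # normalized complex frequency
    return s*s / (s*s + s/q + 1)

def ported_hp(f, f0=40.0):
    """4th-order high-pass (two cascaded 2nd-order sections) as a stand-in
    for a ported box's summed cone-plus-port response at the same f0."""
    return sealed_hp(f, f0) * sealed_hp(f, f0)

def phase_deg(h):
    return math.degrees(cmath.phase(h))

def group_delay_ms(hp_func, f, df=0.01):
    """Group delay = -dphi/domega, estimated numerically, in milliseconds."""
    dphi = cmath.phase(hp_func(f + df)) - cmath.phase(hp_func(f - df))
    # unwrap a possible 2*pi jump at the +/-180 degree boundary
    if dphi > math.pi:  dphi -= 2*math.pi
    if dphi < -math.pi: dphi += 2*math.pi
    return -dphi / (2*math.pi * 2*df) * 1000.0

for f in (20.0, 40.0, 80.0, 400.0):
    print(f"{f:6.0f} Hz  sealed: {phase_deg(sealed_hp(f)):7.1f} deg, "
          f"{group_delay_ms(sealed_hp, f):6.2f} ms   "
          f"ported: {phase_deg(ported_hp(f)):7.1f} deg, "
          f"{group_delay_ms(ported_hp, f):6.2f} ms")
```

Note the delay concentrates around resonance for both alignments and is doubled for the ported model- and this simplified model does not even include the port's extra acoustic path length.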
For living-room use and mix monitoring, I believe point-source design techniques are the best way to achieve fidelity- primarily to avoid hearing time-delayed output from more distant drivers or from more distant panel regions.
Amplitude linearity matters most when the speaker is of minimum-phase design, because then one can hear small deviations from flat amplitude response. But if the speaker has lots of phase shift, that skews the harmonic structure of the music- it puts those harmonics out of phase and thus alters the perceived timbre of the instrument or voice- which makes it harder to "accept" what your test microphone says is "flat" amplitude response. The warped phase response keeps amplitude deviations from being noticed as much. In those designs, phase and amplitude are not independent parameters. Only when you remove the phase nonlinearities does amplitude response become an independent parameter.
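A small sketch of that harmonic-skewing effect, with hypothetical filter values: even where the amplitude response is essentially flat, a higher-order rolloff rotates a 3rd harmonic's phase relative to its fundamental, which changes the waveform shape without any visible amplitude error:

```python
import cmath, math

def hp2(f, f0=40.0, q=0.707):
    """2nd-order high-pass section (sealed-box-style rolloff, illustrative f0/Q)."""
    s = 1j * f / f0
    return s*s / (s*s + s/q + 1)

def hp4(f, f0=40.0):
    """4th-order high-pass: same amplitude character, twice the phase rotation."""
    return hp2(f, f0) * hp2(f, f0)

fund, third = 80.0, 240.0  # a fundamental and its 3rd harmonic, both well above f0

for name, h in (("2nd-order", hp2), ("4th-order", hp4)):
    # relative phase between harmonic and fundamental after the filter
    skew = math.degrees(cmath.phase(h(third)) - cmath.phase(h(fund)))
    print(f"{name}: |H| = {abs(h(fund)):.3f} at {fund:.0f} Hz, "
          f"{abs(h(third)):.3f} at {third:.0f} Hz, "
          f"harmonic skew = {skew:.1f} deg")
```

Both filters pass the two tones at nearly full level, yet the 4th-order case shifts the harmonic roughly twice as far out of alignment with its fundamental- an error a flat amplitude measurement never shows.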
Amplifiers have problems- and if a speaker has phase shift, it changes the sound of those problems, usually making them worse. It is distortion of distortion. An amplifier has its lowest distortion working into a flat impedance curve, and many speaker designers try for that by adding extra parts to the crossover- parts which do flatten the impedance curve, but reduce clarity. Some of those impedance-flattening techniques do more harm than good- especially the ones aimed at the woofer's resonance- what a mistake!
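For reference, the most common of those extra parts is a series R-C "Zobel" network wired across the woofer. A minimal sketch with illustrative driver values (Re and Le below are made up): using the textbook choice Rz = Re and Cz = Le/Re^2, the network exactly cancels the impedance rise from voice-coil inductance in this simplified model:

```python
import cmath, math

RE = 6.0      # voice-coil DC resistance, ohms (illustrative value)
LE = 0.5e-3   # voice-coil inductance, henries (illustrative value)

def z_driver(f):
    """Simplified woofer impedance above resonance: Re + j*omega*Le, rising with f."""
    return RE + 1j * 2*math.pi*f * LE

def z_zobel(f, rz=RE, cz=LE/RE**2):
    """Series R-C Zobel network; rz = Re, cz = Le/Re^2 is the classic choice."""
    return rz + 1/(1j * 2*math.pi*f * cz)

def z_combined(f):
    """Driver and Zobel in parallel, as the amplifier sees them."""
    z1, z2 = z_driver(f), z_zobel(f)
    return z1*z2 / (z1 + z2)

for f in (1000.0, 5000.0, 20000.0):
    print(f"{f:7.0f} Hz: driver alone {abs(z_driver(f)):6.2f} ohm, "
          f"with Zobel {abs(z_combined(f)):6.2f} ohm")
```

The load the amplifier sees is flattened to Re across the band- which is exactly the trade being criticized above: the impedance curve measures flatter, while the added parts sit directly in the signal path.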
Good questions, Mr. Unsound.
Best,
Roy