Directional speaker cables - switching direction


Some time ago I started a thread about speaker wire directionality and my inability to understand how it could have any effect on sound quality. The question was inspired by the fact that, after quite a few years of using them with my Martin Logan Odysseys, I discovered that the cables (Straightwire Octave 2) had arrows printed on them. Not surprisingly, the opinions expressed were pretty strong on both sides of the argument, but those supporting directionality were the most vociferous and in greater numbers, one to the point of being downright insulting. In no case, though, did those supporting the importance of cable direction explain how the phenomenon occurs, other than to suggest it should be obvious, and that once a cable has been broken in in one direction, only someone with an uneducated ear would fail to discern the difference.

Even though I still don't get it, I'm not taking the position that there is no validity to the directional claim; if there truly is, I just don't understand how. This leads me to my two-part question. I haven't been using the Octaves for a few years, but now, because of cable length issues, I want to put them back in my system, partly to avoid the cost of new quality cables.

IF, then, the directionality theory IS valid, and I don't recall which way the arrows originally pointed or which direction the cables were "broken in," do those in support of directionality think I should install them with the arrows pointing toward the speakers?
broadstone
Some cables are designed to go in one direction because of their construction. I had an IC (interconnect) that had the shield attached to ground at the source end and left open at the other end.

To end the debate, just insert the cables both ways and keep the orientation that sounds best.
I think the configuration of any shielding has the potential to be directional, but this seems like it should be more of an interconnect issue, as most speaker cables are not shielded, right? Maybe the geometry of a speaker cable isn't identical in both directions, in the same way that the tread on a directional tire is different. Like you, I can't imagine a scenario where break-in would matter, or how direction would matter, in a simple cable design such as basic Monster Cable or my 10 AWG Blue Jeans Cables.
I can't imagine a scenario where the break-in would matter or how direction would matter in a simple cable design such as basic monster cable or my 10 AWG Blue Jeans Cables.
Mceljo
One way to find out for sure: flip your 10 AWG Blue Jeans Cables end for end and listen for any difference. Post back your results.
One way to find out for sure, flip your 10 AWG Blue Jeans Cables end for end and listen for any difference.
And then flip them back and forth at least twice, and preferably three or more times, not only to verify that your perceptions are consistent, but also that the perceived difference (if any) is not the result of an extraneous variable (for example, changes in contact integrity, AC line voltage or noise conditions, room temperature or humidity, etc.). Re humidity, see the post by Georgelofi dated 6-17-14 in this thread.
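To put a rough number on why a flip or two proves little, here is a short sketch (my own illustration, not anything posted in this thread) of the chance of scoring well in a blind flip test purely by guessing, assuming each trial is an independent 50/50 forced choice; the function name is hypothetical:

```python
from math import comb

def guess_probability(n_trials: int, n_correct: int) -> float:
    """Probability of getting at least n_correct of n_trials right
    by pure guessing (each trial an independent 50/50 choice)."""
    favorable = sum(comb(n_trials, k) for k in range(n_correct, n_trials + 1))
    return favorable / 2 ** n_trials

# Three flips: a perfect score happens by chance 1 time in 8.
print(guess_probability(3, 3))    # 0.125
# Ten flips: a perfect score by chance is about 1 in 1024.
print(guess_probability(10, 10))  # ~0.001
```

The point is only that the confidence grows quickly with repetition: a handful of consistent results across many flips is far harder to dismiss as luck than one or two.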

IMO, the less explicable a perceived difference is, the more thorough the assessment needs to be before concluding that the difference has been attributed to the right thing.

Brf makes a good point, btw: many shielded cables, especially interconnects, are designed asymmetrically and can certainly be expected to be directional. In those cases, the end at which the shield is connected should generally be attached to the component that drives the cable.

Regards,
-- Al