We aren't talking about sweet spots and connecting to TVs. The proposition made by you and a couple of others is that as long as a cable can carry sufficient current then there's no difference in sound. I've just shown you that this isn't the case and pointed you towards graphs that clearly show a difference.

To be honest there's only one "sweet spot" on a system, so anywhere else is a compromise.
If it's connected to a telly then it's an even greater compromise, with the owner's position the only sweet spot both for sound and screen positioning.
I've only skimmed, but other than some minor and likely unrelated differences at low frequency (let's see the same tests run ten times in ten places and whether the same curves show up...), all I can see is a simple difference in delivered power, which is nothing more than a lower series impedance at work. And that's before noting that their '12AWG' cable measured more like 18AWG...
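To put a scale on that "delivered power" difference, here's a minimal Python sketch; the 3 m run and the flat 8-ohm load are my own assumptions for illustration, not figures from the article:

```python
# Back-of-the-envelope: how much level do you lose to cable resistance?
# Series cable resistance forms a voltage divider with the speaker load.
# NOTE: the 3 m run and flat 8-ohm load are assumptions, not measured data.
import math

RHO_CU = 1.724e-8  # resistivity of copper, ohm*m

def round_trip_r(awg: int, length_m: float) -> float:
    """Round-trip DC resistance of a speaker run (signal out and back)."""
    d_m = 0.127e-3 * 92 ** ((36 - awg) / 39)  # standard AWG diameter formula
    area_m2 = math.pi / 4 * d_m ** 2
    return RHO_CU * (2 * length_m) / area_m2

R_LOAD = 8.0  # nominal speaker impedance, ohms

for awg in (12, 15, 18):
    r = round_trip_r(awg, 3.0)
    drop_db = 20 * math.log10(R_LOAD / (R_LOAD + r))
    print(f"{awg} AWG: {r * 1e3:3.0f} mOhm round trip, {drop_db:+.3f} dB at the load")
# 12 AWG: ~31 mOhm, -0.034 dB; 18 AWG: ~126 mOhm, -0.135 dB.
# About a tenth of a dB between them: exactly the shape of a lower impedance at work.
```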
And on that note:
If that were the case then the generic 12 gauge cable, at 2.05 mm² cross-sectional area, should outperform the QED Silver Anniversary, which is a 1.5 mm² cable roughly equivalent to 15AWG, except it doesn't. So how do you explain that then?
12AWG is 3.31 mm². You're thinking of 14.
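Easy to verify from the standard AWG definition (diameter in mm is 0.127 * 92^((36 - n) / 39)); a quick sketch:

```python
# Cross-sectional area from AWG number, using the standard definition:
# d(n) = 0.127 mm * 92**((36 - n) / 39)
import math

def awg_to_mm2(n: int) -> float:
    d_mm = 0.127 * 92 ** ((36 - n) / 39)
    return math.pi / 4 * d_mm ** 2

for n in (12, 14, 15, 16):
    print(f"{n} AWG = {awg_to_mm2(n):.2f} mm^2")
# 12 AWG = 3.31 mm^2 and 14 AWG = 2.08 mm^2, so a "2.05 sq mm" cable
# is 14 AWG territory; 1.5 mm^2 falls between 15 and 16 AWG.
```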
The measured DCR in the article and the claimed cable gauges do not line up (I'm allowing for some offset here and simply comparing the relative differences... and it still doesn't add up). Something is fairly wrong, and I don't trust their test setup. Or their '12AWG'.
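You can also work backwards from a measured resistance to the gauge it implies. A sketch assuming a plain copper conductor; the 21 mOhm/m figure below is a hypothetical stand-in, not the article's actual measurement:

```python
# Invert the AWG formula: what gauge does a measured DC resistance imply?
# Assumes a plain copper conductor; the sample values are hypothetical.
import math

RHO_CU = 1.724e-8  # resistivity of copper, ohm*m

def effective_awg(r_per_m: float) -> float:
    """AWG implied by a measured per-conductor resistance in ohm/m."""
    area_m2 = RHO_CU / r_per_m
    d_mm = math.sqrt(4 * area_m2 / math.pi) * 1e3
    return 36 - 39 * math.log(d_mm / 0.127, 92)

print(f"{effective_awg(0.0052):.1f}")  # true 12 AWG copper: ~5.2 mOhm/m -> 12.0
print(f"{effective_awg(0.0210):.1f}")  # a cable measuring 21 mOhm/m -> 18.0
```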