It’s about time we had another cable thread.

Perhaps a laser reflecting from the cone, the distance being recorded at tiny intervals of time (nanoseconds?). Then play the same track with different leads. A simple computer comparison of the plots would show variations, or not. This is a bit like zooming into a digital photo, where side-by-side comparisons are irrefutable.
I don't understand why this hasn't been done; I can only assume we haven't got a high-end hifi buff who is also a scientist. One thing I would stipulate, though, is that the measurements must be at nanosecond level, to remove all doubt or argument.
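If anyone does capture those plots, the comparison step is easy to script, and it works the same whether the trace is laser-measured displacement or voltage at the terminals. A minimal sketch in Python (file names hypothetical; assumes two mono WAV captures of the same excerpt): time-align the traces, match levels, subtract, and report how far down the residual sits. This is essentially what Audio DiffMaker automates.

```python
# Sketch: compare two captures of the same track (e.g. laser-vibrometer
# displacement, or voltage at the speaker terminals) and report how deep
# the residual is after alignment. File names are hypothetical.
import numpy as np
from scipy.io import wavfile
from scipy.signal import correlate

def load_mono(path):
    rate, data = wavfile.read(path)
    data = data.astype(np.float64)
    if data.ndim > 1:                      # fold stereo to mono
        data = data.mean(axis=1)
    return rate, data

rate_a, a = load_mono("cable_A.wav")
rate_b, b = load_mono("cable_B.wav")
assert rate_a == rate_b, "captures must share a sample rate"

# Time-align: find the lag that maximises the cross-correlation.
lag = int(np.argmax(correlate(a, b, mode="full"))) - (len(b) - 1)
if lag > 0:
    a = a[lag:]
elif lag < 0:
    b = b[-lag:]
n = min(len(a), len(b))
a, b = a[:n], b[:n]

# Match levels with a least-squares gain; a plain level difference is
# not what we are hunting for.
b = b * (np.dot(a, b) / np.dot(b, b))

residual = a - b
null_db = 10 * np.log10(np.mean(residual**2) / np.mean(a**2))
print(f"residual is {null_db:.1f} dB below capture A")
```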
Measuring cone movement won't work. Cone materials are very sensitive to temperature and humidity, and the speaker motor and cone are affected by their history, i.e. what has been played recently.
Measuring the voltage across the speaker terminals is more promising.
 
All that needs to be done is to measure a difference between cables, interconnects, leads... whatever. If there is a difference, end of story; it's then a subjective opinion which is better.
Hasn't it? With Audio DiffMaker? And no, it isn't the end of the story.
How to measure a difference? Because what we hear comes purely from the back-and-forth movement of the drivers, we need to plot that movement.
No. How would measuring with greater precision than the power amp's output help? Or below the thermal noise of the loudspeaker? And why would we measure any further down the chain than the output of the cable, or the electrical output of the first device down the chain? Perhaps in the case of a power lead, I suppose; but on your argument, if a difference is a difference, then just measure the difference. If you read Archimago's blog you may well find what you are looking for.

And are any two analog outputs exactly the same? In short, the answer is no. IIRC you can generally only get analog outputs to null to something like 80 dB down. There always will be some difference.

So yes, the answer always lies in whether the difference between two things is
a) capable of being attributed to the wire, rather than to other components, the limits of the measuring system, or random variation;
b) capable of being heard. This is, in old-fashioned terminology, a subjective test. But it is not a matter of opinion.
 
It just makes sense to me to measure what the end result is (the sound) without interference.
Measuring at the speaker cables could affect the results by the act of measurement, or even add noise from the measuring instrument, and the act of driving the cones would likely affect the amplifier's output. Light beamed onto a cone seems like a good way of measuring without interfering.
Doing the same test without changing anything would give a 'random fluctuation' level; I would expect this to be very, very small if the tests are done back to back. Changes beyond that level could be noted.
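The bookkeeping for that is simple. A sketch, where null_depth_db is a hypothetical wrapper around the align-and-subtract routine sketched earlier in the thread, and the 3 dB margin is an arbitrary choice:

```python
# Sketch: establish the random-fluctuation floor by comparing two
# captures made with NOTHING changed, then see whether a cable swap
# nulls any worse than that floor. null_depth_db() is the hypothetical
# alignment-and-subtraction routine sketched earlier in the thread.
floor_db = null_depth_db("cable_A_run1.wav", "cable_A_run2.wav")
swap_db  = null_depth_db("cable_A_run1.wav", "cable_B_run1.wav")

print(f"repeatability floor : {floor_db:.1f} dB")
print(f"after cable swap    : {swap_db:.1f} dB")

# Only a residual clearly above the floor counts as a real,
# attributable difference; the 3 dB margin here is a judgement call.
if swap_db > floor_db + 3.0:
    print("difference exceeds the measurement floor")
else:
    print("indistinguishable from run-to-run variation")
```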
 
I am looking forward to all those long overdue Nobel Prizes in Physics finally being awarded to the people in the audiophile business...

Without doubt there is so much audiophool stuff out there that it almost defies belief. However, there's got to be something in it, IMHO, judging by my experiences of using LFD gear.
 
So as a result of discovering your test setup was faulty, you have totally given up on any attempt at reliable testing and instead decided to follow your subjective preference?
Well, I don't have access to reliable ways of testing... worst case I'm out 400 bucks in my system for the AudioQuest Water.
 
Interestingly, it seems like Colin Wonfor is especially interested in phase distortion when designing his cables.

Our measurements are dominated by IMD & THD figures, crude LCR measurements, etc., but give very little insight into minute phase distortions from 20 Hz to 20 kHz, you know, with real music and real people (who are sensitive to different things at different frequencies), rather than frequency sweeps and sine waves...

For a long time, it was generally considered that relative phase is not audible, which is, of course, untrue:

http://www.silcom.com/~aludwig/Phase_audibility.htm
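Relative phase is at least cheap to manipulate and measure. A minimal sketch with arbitrary parameters: a first-order digital all-pass has a perfectly flat magnitude response, so anything audible through it is down to phase alone, which is the kind of stimulus such audibility tests rely on.

```python
# Sketch: a first-order all-pass section, H(z) = (c + z^-1)/(1 + c*z^-1).
# Its magnitude is exactly 1 at every frequency; only the phase changes.
import numpy as np
from scipy.signal import freqz

fs = 48000                    # sample rate (arbitrary choice)
c = 0.5                       # all-pass coefficient, |c| < 1 for stability
b, a = [c, 1.0], [1.0, c]

w, h = freqz(b, a, worN=[100, 1000, 10000], fs=fs)
for f, resp in zip(w, h):
    print(f"{f:6.0f} Hz  |H| = {abs(resp):.6f}  "
          f"phase = {np.degrees(np.angle(resp)):7.1f} deg")
```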
 
IIRC digital interconnects have been measured by one of the HiFi mags, and the expensive cables generally introduced much less jitter. It therefore seems very likely that analogue interconnects introduce different time-domain distortions. Whether these time-domain differences are audible can only be a hypothesis.

Nic P
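FWIW, jitter is measurable with fairly ordinary tools once you have a high-rate capture of the signal; the capture is the hard part and is simply assumed here. A sketch: locate clock edges by interpolation and measure their scatter around a best-fit uniform grid. The 200 ps of injected noise is synthetic, purely to sanity-check the method, and the estimate should come out in that neighbourhood.

```python
# Sketch: estimate RMS jitter from a captured clock-like waveform.
# A real test would use a scope capture of the S/PDIF signal or its
# recovered clock; here we synthesise one to check the method works.
import numpy as np

def rising_edge_times(signal, sample_period):
    """Rising zero-crossing times, with linear interpolation between samples."""
    s = signal - np.mean(signal)                   # remove any DC offset
    idx = np.where((s[:-1] < 0) & (s[1:] >= 0))[0]
    frac = -s[idx] / (s[idx + 1] - s[idx])         # sub-sample position
    return (idx + frac) * sample_period

def rms_jitter(edge_times):
    """RMS deviation of the edges from the best-fit uniform grid."""
    n = np.arange(len(edge_times))
    period, offset = np.polyfit(n, edge_times, 1)  # least-squares ideal grid
    return np.sqrt(np.mean((edge_times - (offset + period * n)) ** 2))

# Synthetic demo: a 6.144 MHz clock sampled at 1 GS/s with roughly
# 200 ps of gaussian timing noise added to its phase.
rng = np.random.default_rng(0)
fs, f_clk = 1e9, 6.144e6
t = np.arange(int(1e5)) / fs
phase = 2 * np.pi * f_clk * (t + rng.normal(0, 200e-12, t.size))
signal = np.sin(phase)

edges = rising_edge_times(signal, 1 / fs)
print(f"estimated RMS jitter: {rms_jitter(edges) * 1e12:.0f} ps")
```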
 

To quote from that phase-audibility link:

There is extensive data that indicates that under normal listening conditions, with real music, even experienced listeners have great difficulty perceiving phase effects. To quote from a recent survey paper by a top engineer at Harman International, Dr. Floyd Toole: "It turns out that, within very generous tolerances, humans are insensitive to phase shifts. Under carefully contrived circumstances, special signals auditioned in anechoic conditions, or through headphones, people have heard slight differences. However, even these limited results have failed to provide clear evidence of a 'preference' for a lack of phase shift. When auditioned in real rooms, these differences disappear..." (The piano note demo shows that if you accept a broad definition of "contrived" you can get more than a "slight" difference.)

As discussed in other sections, I have done a lot of listening tests comparing 1st- and 4th-order crossovers, which have very different phase responses. Arny Krueger provides test files and a double-blind test program, and you can do these tests yourself. In contrast to the artificial phase modification used for the piano notes, these have very realistic phase responses. I didn't find any differences that could not be attributed to amplitude. If there are audible differences due to typical crossover phase responses, they are pretty subtle and, in my opinion, minor compared to other problems in the reproduction of music.

In any case, any slight phase errors caused by a cable are totally overshadowed by those caused by the loudspeakers.
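Those crossover phase differences are easy to put numbers on. A sketch with arbitrary choices (Butterworth low-pass sections, 2 kHz corner): at the corner frequency the 4th-order section has rotated roughly four times as much phase as the 1st-order one.

```python
# Sketch: phase responses of 1st- vs 4th-order low-pass sections at a
# 2 kHz crossover, the kind of comparison those listening tests used.
import numpy as np
from scipy.signal import butter, freqz

fs, fc = 48000, 2000                        # sample rate, corner (arbitrary)
for order in (1, 4):
    b, a = butter(order, fc, fs=fs)         # Butterworth low-pass
    w, h = freqz(b, a, worN=4096, fs=fs)
    phase = np.degrees(np.unwrap(np.angle(h)))   # continuous phase, degrees
    picks = [np.argmin(np.abs(w - f)) for f in (200, 2000, 8000)]
    print(f"order {order}: " + ", ".join(
        f"{w[i]:5.0f} Hz -> {phase[i]:7.1f} deg" for i in picks))
```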
 
Well, two wrongs never make a right, and we are talking not about huge phase changes, as in a speaker or a room, but about tiny edge delays etc.
 
And yet few if any trials in other fields are conducted in a way that resembles double-blind ABX testing in audio.

"Here, take this drug. Now, take this placebo. Now, take this unidentified pill, and tell us if it's the drug or the placebo."

Many years ago I did a drug trial (for a new antidepressant) that looked a bit like that! Of course, it was a bit more technical and we called it a "double-blind, double-dummy, placebo-controlled crossover study". Stripped to the basics, your description is pretty close.
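The audio version is cheap to run by comparison; the scoring end of an ABX session is a few lines. A sketch (16 trials and p < 0.05 are just the usual conventions, not gospel): randomise X each trial, count correct identifications, and compute the odds of doing at least that well by guessing.

```python
# Sketch: scoring an ABX session. `guesses` would come from the real
# listener; here we simulate a pure coin-flipper as a stand-in.
import random
from math import comb

def p_value(correct, trials):
    """P(at least `correct` right out of `trials`) under pure guessing."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2**trials

trials = 16
x_assignments = [random.choice("AB") for _ in range(trials)]
guesses = [random.choice("AB") for _ in range(trials)]   # stand-in listener
correct = sum(g == x for g, x in zip(guesses, x_assignments))

p = p_value(correct, trials)
print(f"{correct}/{trials} correct, p = {p:.3f}"
      + ("  (would usually be called significant)" if p < 0.05 else ""))
```

With 16 trials you need 12 or more correct to get under the usual 5% threshold; a real session just swaps the simulated guesser for the listener's actual responses.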
 
I challenge you to find that jitter article. Any digital cable should be agnostic; if it's not, it's not fit for duty, such as using an RCA cable for S/PDIF transmission.
 

