JensenHealey
pfm Member
Nanoseconds are unnecessary. Ears do not work that quickly!
Well, might as well do it at the highest possible resolution, to avoid all doubt.
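The timescale argument is easy to put in numbers. A quick sketch in plain Python; the only assumption is the conventional 20 kHz upper limit of human hearing:

```python
# Rough timescales: the period of the highest audible frequency
# versus a nanosecond measurement interval.
f_max_hz = 20_000            # conventional upper limit of human hearing
period_s = 1.0 / f_max_hz    # one full cycle at 20 kHz
ns = 1e-9

print(f"Period at 20 kHz: {period_s * 1e6:.0f} us")    # 50 us per cycle
print(f"Nanoseconds per cycle: {period_s / ns:,.0f}")  # 50,000 ns per cycle

# A standard 192 kHz capture already places ~9.6 samples in that cycle,
# and by Nyquist about 2 samples per cycle suffice to reconstruct it.
print(f"Samples per cycle at 192 kHz: {192_000 / f_max_hz:.1f}")
```

So nanosecond resolution is roughly 50,000 points per cycle of the fastest audible tone, orders of magnitude beyond what "avoiding all doubt" could require.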
Keith: "Or you can just measure the electrical parameters of the cables: L, C, R."

No, got to measure the actual output. Measuring those things will not prove anything.
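The L, C, R point can be made concrete by plugging lumped cable values into a first-order model and seeing where the roll-off lands. The component values below are invented for illustration, in the ballpark of a few metres of ordinary cable; real cables vary, which is exactly why you would measure them:

```python
import math

# Assumed lumped values for ~3 m of speaker cable (illustrative only)
R_series = 0.03      # ohms, total loop resistance
L_series = 1.5e-6    # henries, total series inductance
C_shunt  = 300e-12   # farads, total shunt capacitance
R_load   = 8.0       # ohms, nominal loudspeaker load

# Series R-L feeding a resistive load: the -3 dB corner of the
# resulting low-pass is f = (R_load + R_series) / (2*pi*L).
f_corner = (R_load + R_series) / (2 * math.pi * L_series)
print(f"L-R corner: {f_corner / 1e3:.0f} kHz")   # ~852 kHz, far above 20 kHz

# In-band level loss is just the resistive divider:
loss_db = 20 * math.log10(R_load / (R_load + R_series))
print(f"In-band loss: {loss_db:.3f} dB")         # a few hundredths of a dB

# For a line-level interconnect, the shunt C works against the source
# impedance instead (1 kOhm assumed here):
f_rc = 1 / (2 * math.pi * 1e3 * C_shunt)
print(f"R-C corner: {f_rc / 1e3:.0f} kHz")       # ~531 kHz, also far above 20 kHz
```

Under these (assumed) values, both corners sit decades above the audio band, which is the usual measurement-side argument that cable L, C and R differences are inaudible at sane lengths.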
"Perhaps a laser reflecting from the cone, the distance being recorded at tiny intervals of time (nanoseconds?). Then play the same track with different leads. A simple computer comparison of plots would show variations, or not. This is a bit like zooming into a digital photo, when side-by-side comparisons are irrefutable.

I don't understand why this hasn't been done; I can only assume we haven't got a high-end hi-fi buff who is also a scientist. One thing I would stipulate, though, is that the measurements must be at nanosecond level, to remove all doubt or argument."

Measuring cone movement won't work. Cone materials are very sensitive to temperature and humidity, and the speaker motor and cone are affected by history: what has been played recently.
"All that needs to be done is to measure a difference between cables, interconnects, leads... whatever. If there is a difference, end of story; it's then a subjective opinion which is better."

Hasn't it? With Audio DiffMaker? And no, it isn't the end of the story.
"How to measure a difference? Because what we hear comes purely from the back-and-forth movements of the drivers, we need to plot those movements."

No. How would measuring with greater precision than the power amp's output help? Or below the thermal noise of the loudspeaker? And why would we measure any further down the chain than the output of the cable, or the electrical output of the first device down the chain? Perhaps in the case of a power lead, I suppose, but on your argument, if a difference is a difference, then just measure the difference. If you read Archimago's blog you may well find what you are looking for.
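The kind of difference measurement Audio DiffMaker automates can be sketched in a few lines. The two "captures" below are synthetic stand-ins (the signal and the 0.1% level offset are invented for illustration); the key step is the least-squares level match, so that a pure gain difference nulls out and only genuine waveform differences survive:

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 48_000
t = np.arange(fs) / fs   # one second of "programme"

# Stand-ins for two captures of the same music through different cables:
# identical programme, an invented 0.1% gain difference, plus a small
# independent noise floor on the second capture.
music = np.sin(2 * np.pi * 440 * t) + 0.3 * rng.standard_normal(fs)
take_a = music
take_b = 1.001 * music + 1e-4 * rng.standard_normal(fs)

def residual_db(a, b):
    """Level of the nulled difference relative to the reference, in dB."""
    gain = np.dot(a, b) / np.dot(a, a)   # least-squares level match first
    diff = b - gain * a
    return 10 * np.log10(np.sum(diff**2) / np.sum(a**2) + 1e-300)

print(f"Null depth: {residual_db(take_a, take_b):.0f} dB")
```

With these made-up numbers the null lands near the injected noise floor; in a real test the null depth is limited by exactly the things mentioned above: amplifier noise, thermal noise, and how repeatable the two captures are.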
I am looking forward to all those long overdue Nobel Prizes in Physics finally awarded to the people in the audiophile business...
Alien technology?
Keith
"well i dont have access to reliable ways of testing... worst case im out 400 bucks in my system for the audioquest water"

So, as a result of discovering your test setup was faulty, you have totally given up on any attempt at reliable testing and instead decided to follow your subjective preference?
For a long time, it was generally considered that relative phase is not audible, which is, of course, untrue:
http://www.silcom.com/~aludwig/Phase_audibility.htm
There is extensive data that indicates that under normal listening conditions, with real music, even experienced listeners have great difficulty perceiving phase effects. To quote from a recent survey paper by a top engineer at Harman International, Dr. Floyd Toole: "It turns out that, within very generous tolerances, humans are insensitive to phase shifts. Under carefully contrived circumstances, special signals auditioned in anechoic conditions, or through headphones, people have heard slight differences. However, even these limited results have failed to provide clear evidence of a 'preference' for a lack of phase shift. When auditioned in real rooms, these differences disappear." (The piano note demo shows that if you accept a broad definition of "contrived" you can get more than a "slight" difference.)
As discussed in other sections I have done a lot of listening tests comparing 1st and 4th order crossovers, which have very different phase responses. Arny Krueger provides test files and a double-blind test program and you can do these tests yourself. In contrast to the artificial phase modification used for the piano notes, these have very realistic phase responses. I didn't find any differences that could not be attributed to amplitude. If there are audible differences due to typical crossover phase responses, they are pretty subtle, and in my opinion minor compared to other problems in the reproduction of music.
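The 1st-versus-4th-order comparison can be reproduced numerically. A sketch using analog Butterworth low-pass sections (the 2 kHz crossover frequency is an assumption, not a figure from the article):

```python
import numpy as np
from scipy.signal import butter, freqs

fc = 2_000.0                               # assumed crossover frequency, Hz
w = 2 * np.pi * np.logspace(1, 4.5, 500)   # sweep 10 Hz .. ~31.6 kHz, rad/s

def phase_at_fc(order):
    # Analog Butterworth low-pass, the textbook crossover slope
    b, a = butter(order, 2 * np.pi * fc, btype="low", analog=True)
    _, h = freqs(b, a, worN=w)
    phase = np.degrees(np.unwrap(np.angle(h)))
    return np.interp(2 * np.pi * fc, w, phase)

# A 1st-order section lags 45 degrees at fc; a 4th-order section lags 180,
# so the two crossover types distribute phase very differently.
print(f"1st order at fc: {phase_at_fc(1):.0f} deg")   # about -45
print(f"4th order at fc: {phase_at_fc(4):.0f} deg")   # about -180
```

This only shows the electrical phase difference exists; whether it is audible is exactly what the listening tests described above were probing.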
And yet few if any in other fields are conducted in a way that resembles double-blind ABX testing in audio.
"Here, take this drug. Now, take this placebo. Now, take this unidentified pill, and tell us if it's the drug or the placebo."
Nic P: "IIRC digital interconnects have been measured by one of the HiFi mags, and the expensive cables generally introduced much less jitter. It therefore seems very likely that analogue interconnects introduce different time-domain distortions. Whether these time-domain differences are audible can only be a hypothesis."

I challenge you to find that article. Any digital cable should be agnostic; if it's not, it's not fit for duty, such as using an RCA cable for S/PDIF transmission.
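On the jitter point, the worst-case size of a timing error is one line of arithmetic: for a full-scale sine at frequency f, a timing error of dt can shift the sampled value by at most 2*pi*f*dt (slope times time). The jitter figures below are illustrative, not taken from the magazine measurements Nic P recalls:

```python
import math

def jitter_error_db(f_hz, jitter_s):
    """Worst-case sample error for a full-scale sine with timing error
    jitter_s: bound |d/dt sin(2*pi*f*t)| * dt = 2*pi*f*dt, in dBFS."""
    return 20 * math.log10(2 * math.pi * f_hz * jitter_s)

# Assumed jitter values, evaluated at a 10 kHz tone:
for dt in (10e-9, 1e-9, 100e-12):
    print(f"{dt * 1e9:5.1f} ns jitter -> {jitter_error_db(10_000, dt):6.1f} dBFS")
```

Even 10 ns of jitter bounds the error around -64 dBFS at 10 kHz, and 100 ps around -104 dBFS, which gives a sense of how small a measured cable-to-cable jitter difference would have to be before audibility becomes plausible.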