Do amplifiers really sound the same?

Your macro lens may, in fact, be less sharp than your other one. The only way to be sure is to check using the line charts which are readily available, and to use the same lighting for all. The other problems are that a contrasty lens can appear sharper than one with less contrast, and the f-stop also has a marked effect, particularly on edge sharpness. There's also the issue of a tripod, which has to be used for any kind of comparison between lenses.

It's easy to measure the parameters of any lens, but looking at an enlarged image of a snapshot on your computer may lead to a misleading conclusion.
 
Detail vs Transparency

Until we bought our first HDTV recently, I thought the 100Hz display of our Loewe CRT was the bee's knees. Does the HDTV make the Loewe less transparent?

I have a macro lens that can capture the same image and perspective as a standard lens, but upon close-up examination, it's evident that the macro lens produces a sharper image. Does that mean the macro lens is more transparent?

This is pertinent to me, because I'm still arguing the toss between a standard display and Retina on a new MBP. I don't think I can see the difference (with presbyopia), but it'd make me feel sooo much better to have a more 'transparent' and higher 'spec' computer. :D

The whole subject of TV resolution is actually a pretty good way of explaining the concept of "audible" transparency in amplifiers.

The industry would have you believe that you MUST have a full HD TV with 1080 lines of resolution. What they don't tell you (and quite likely don't actually understand) is that our eyes have a limit to their resolving power: there is a minimum arc they can resolve. What this means is that at any given distance humans can only see objects above a certain size (yes, it does vary person to person, but there is still an ultimate limit). Correlate this with the fact that for a resolution of 1080 lines the physical pixels need to be a particular size for any given size of TV; a larger TV means larger pixels, or more importantly a larger distance between lines. So for any given viewing distance there is a minimum spacing between lines required for humans to be able to resolve them, which in turn sets a minimum TV size necessary to meet that spacing. Ergo, on any TV smaller than that minimum size the extra resolution is simply not visible, and you reach a point where it is physically impossible to resolve the difference between a TV with 720 lines of resolution and one with 1080.

In much the same way, there is a limit to the acuity of human hearing, and so "resolution" below that point is simply inaudible to us.
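
To put rough numbers on this, here's a back-of-the-envelope sketch in Python. It assumes the usual 1 arc-minute figure for 20/20 acuity and a 16:9 screen; the screen sizes are just examples.

```python
import math

def max_resolving_distance(diagonal_in, lines, aspect=16 / 9, acuity_arcmin=1.0):
    """Farthest viewing distance (inches) at which adjacent display
    lines can still be separated, given the stated visual acuity."""
    height_in = diagonal_in / math.sqrt(1 + aspect ** 2)  # screen height from diagonal
    line_pitch = height_in / lines                        # spacing between lines
    theta = math.radians(acuity_arcmin / 60)              # acuity as an angle
    return line_pitch / math.tan(theta)

for size in (32, 42, 55):
    d720 = max_resolving_distance(size, 720) / 12    # convert to feet
    d1080 = max_resolving_distance(size, 1080) / 12
    print(f'{size}": 720-line detail resolvable within {d720:.1f} ft, '
          f'1080-line within {d1080:.1f} ft')
```

Sit further back than the printed distance and the extra lines buy you nothing, which is exactly the argument above.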
 
The whole subject of TV resolution is actually a pretty good way of explaining the concept of "audible" transparency in amplifiers.

The industry would have you believe that you MUST have a full HD TV with 1080 lines of resolution. What they don't tell you (and quite likely don't actually understand) is that our eyes have a limit to their resolving power: there is a minimum arc they can resolve. What this means is that at any given distance humans can only see objects above a certain size...

Indeed, and it doesn't help that many using high resolution displays are feeding them lossy compressed video!
 
Did you see the gorilla? An interesting film about perception that transcends picture resolution.
We miss loads, are inconsistent, and tend to focus on particular aspects of a soundscape or visual sequence, then are surprised when we see or hear something previously unnoticed. It's something I particularly enjoy about hi-fi.
In telecine we used to do ad transfers for cinema on the Spirit at about 2,000 lines of resolution, since uprated to 4K.
 
So in Hi-fi terms is there a difference between transparency and resolution?

I'm assuming something is transparent if it doesn't add distortion, whereas resolution is related to the revealing of fine detail...
 
So in Hi-fi terms is there a difference between transparency and resolution?

I'm assuming something is transparent if it doesn't add distortion, whereas resolution is related to the revealing of fine detail...

Resolution is a function of the source, not the amplifier, assuming the amplifier is reasonably linear & transparent. Loss of resolution would measure as distortion.

Chris
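
As an illustration of that last point, lost fidelity really does show up as measurable distortion. The sketch below passes a pure tone through a stand-in nonlinearity (the tanh soft-clip is purely hypothetical, not any real amplifier) and reads THD off the FFT; one second of signal at an integer-bin frequency means no window is needed.

```python
import numpy as np

fs, f0, n = 48_000, 1_000, 48_000        # sample rate, test tone (Hz), 1 s of samples
t = np.arange(n) / fs
x = np.sin(2 * np.pi * f0 * t)

# Stand-in for the device under test: a mild, hypothetical soft-clip.
y = np.tanh(1.2 * x) / np.tanh(1.2)

spectrum = np.abs(np.fft.rfft(y))                     # 1 Hz per bin
fund = spectrum[f0]                                   # fundamental at bin 1000
harmonics = spectrum[[k * f0 for k in range(2, 10)]]  # 2nd to 9th harmonics
thd = np.sqrt(np.sum(harmonics ** 2)) / fund
print(f"THD: {100 * thd:.2f}%")
```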
 
Correlate this with the fact that for a resolution of 1080 lines the physical pixels need to be a particular size for any given size of TV; a larger TV means larger pixels, or more importantly a larger distance between lines. So for any given viewing distance there is a minimum spacing between lines required for humans to be able to resolve them, which in turn sets a minimum TV size necessary to meet that spacing. Ergo, on any TV smaller than that minimum size the extra resolution is simply not visible, and you reach a point where it is physically impossible to resolve the difference between a TV with 720 lines of resolution and one with 1080.

In much the same way, there is a limit to the acuity of human hearing, and so "resolution" below that point is simply inaudible to us.
Possibly, but I swear my 55" HDTV on HD content is significantly sharper than the 32" Loewe CRT on SD transmissions. Maybe SD has fewer than 720 lines.
 
Sorry, I'm not explaining myself very well. Let me try again.

When using Audigy, for example, to compare input vs output, or one excerpt against another, you get a chart showing amplitude plotted against a time scale. I was contending that it's more insightful if a third dimension is used to illustrate the spectral content (frequency and amplitude) against time. Maybe such an analyser is already available; I'm just not that familiar with the technology.
Yes - it's called a spectrum analyser, and is used extensively in amplifier (and other electronics) testing. In formally testing an amplifier, I would expect you to measure the response to steady-state frequencies across the audible spectrum and to impulses, not to time-varying music. Comparing the input and output waveforms of music by eye using Audigy is not a very insightful way of doing it (but looking at the difference of level-matched input and output will show you where one differs from the other, and can give you some idea of performance).
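
For what it's worth, that "third dimension" is exactly what a spectrogram plot gives you: level as colour, against frequency and time. A minimal sketch using scipy and matplotlib follows; the swept tone is just a stand-in for a music excerpt.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.signal import spectrogram

fs = 48_000
t = np.arange(2 * fs) / fs
x = np.sin(2 * np.pi * (200 + 2_000 * t) * t)   # stand-in signal: a rising sweep

f, seg_t, Sxx = spectrogram(x, fs=fs, nperseg=2048)
plt.pcolormesh(seg_t, f, 10 * np.log10(Sxx + 1e-12), shading="gouraud")
plt.xlabel("Time (s)")
plt.ylabel("Frequency (Hz)")
plt.title("Spectral content against time")
plt.colorbar(label="Level (dB)")
plt.show()
```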
 
Interesting, is it possible for amplifiers to be more or less resolving regardless of their transparency? What's the measure of resolution?
I think we have to be very careful with analogies because it is very easy to strain them beyond their usefulness without noticing where they break down. As I have said before, transparency does not mean the same in audio as transparency means in optics, and I don't think resolution means the same. In audio, both transparency and resolution are metaphoric terms to describe the subjective experience of listening to audio kit whereas in optics they mean something precise and measurable. The analogy is also strained by the fact that resolution in optics refers to the ability of the lens with respect to high spatial frequencies - a lens will have a limit to its resolution (at a particular aperture, field position and focal condition) usually taken to be the first minimum in the MTF - spatial detail finer than that, even if present in the object, cannot be resolved in the image. Competent lenses do vary in their resolving power.

In the case of audio, spatial resolution of a signal through an amplifier is meaningless and it is the temporal aspect that matters. I take it that resolution in audio refers to the ability of the system to make fine details of the time varying music signal audible. Since all competent amplifiers will pass all audible frequencies within a few dB, I don't see how one competent amplifier can have higher or lower resolution than another. Speakers are a different matter as they can have resonances, overhang and colourations that mask temporal detail.
 
... I would expect you to measure the response to steady-state frequencies across the audible spectrum and to impulses, not to time-varying music.
Steady-state and impulse spectra represent amplifier performance about as much as an MLSSA graph does for loudspeaker performance. Are there no spectrum analysers that compare the input and output of real-time music (adjusted for transmission delay, if necessary) to show spectral differences?
 
Your macro lens may, in fact, be less sharp than your other one. The only way to be sure is to check using the line charts which are readily available, and to use the same lighting for all. The other problems are that a contrasty lens can appear sharper than one with less contrast, and the f-stop also has a marked effect, particularly on edge sharpness. There's also the issue of a tripod, which has to be used for any kind of comparison between lenses.

It's easy to measure the parameters of any lens, but looking at an enlarged image of a snapshot on your computer may lead to a misleading conclusion.
Agreed, but contrast at any frequency is a strong function of the resolving power (aka sharpness) of the lens at that frequency, as well as of reductions in contrast caused by scattered and reflected stray light. Sharpness and contrast are not independent parameters in lenses, unless we are referring to contrast as the spatial frequency approaches zero. And the total performance is captured by the MTF, which is a function of image-forming aberrations, diffraction and stray light.
 
In the case of audio, spatial resolution of a signal through an amplifier is meaningless and it is the temporal aspect that matters. I take it that resolution in audio refers to the ability of the system to make fine details of the time varying music signal audible.
Would a wide bandwidth amplifier that is inherently more accurate with square-wave reproduction resolve timing differences better than one that is bandwidth limited, all other parameters being the same?
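
There is a standard rule of thumb for that question: for a single dominant pole, the 10-90% rise time of a step (or square-wave edge) is roughly 0.35 divided by the bandwidth. So a wider-bandwidth amplifier does reproduce edges faster, as the quick calculation below shows; whether a 3.5 µs edge versus a 17.5 µs one is audible, given hearing stops around 20 kHz, is the real question.

```python
# Rule of thumb for a single-pole roll-off: 10-90% rise time ≈ 0.35 / bandwidth.
for bw_hz in (20_000, 100_000, 500_000):
    rise_us = 0.35 / bw_hz * 1e6
    print(f"{bw_hz / 1000:>5.0f} kHz bandwidth -> rise time ≈ {rise_us:.1f} µs")
```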
 
Possibly, but I swear my 55" HDTV on HD content is significantly sharper than the 32" Loewe CRT on SD transmissions. Maybe SD has fewer than 720 lines.

Standard Def has 576 lines (480 in North America). 720 lines is the lowest resolution accepted as being HD. SD was never intended to be viewed on such large screens as are common today. Back in the day an average TV screen size was about 12-14". I remember thinking 17" screens were big and 21" huge (a 21" 4:3 TV would be a 26" widescreen of the same height). On those kinds of screen sizes SD looks good enough, with no apparent loss of detail, because at normal viewing distances you couldn't distinguish a higher resolution anyway.

Here is a link to viewing distance and resolution. Remember, though, that it assumes 20/20 vision, so if (like me) your natural or corrected vision is better, you'd achieve the same result from further away.

http://s3.carltonbale.com/resolution_chart.html

As you can see, for a 26" (21" normal aspect) TV, SD resolution becomes as good as it gets (i.e. any higher would be pointless) at a distance of around 7.5 ft; that's for 480-line US SD - the UK figure would be more like 6 ft. So it fits in well with average UK viewing distances.



Robert, a good point. I've often wondered why no TV manufacturer produces a model with a "zoom out" function, to make SD pictures smaller and so keep the quality limitations caused by the lack of resolution and poor encoding to a minimum.
 
http://s3.carltonbale.com/resolution_chart.html

As you can see, for a 26" (21" normal aspect) TV, SD resolution becomes as good as it gets (i.e. any higher would be pointless) at a distance of around 7.5 ft; that's for 480-line US SD - the UK figure would be more like 6 ft. So it fits in well with average UK viewing distances.
Thanks for the link. That pretty much makes the 2,880 x 1,800 resolution of the 15.4" MBP Retina a bit pointless at normal lappy viewing distances. Or does it?
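
Applying the same acuity arithmetic as earlier in the thread suggests it's marginal rather than pointless, again assuming 20/20 (1 arc-minute) vision:

```python
import math

ppi = math.hypot(2880, 1800) / 15.4   # pixel density of the 15.4" panel, about 220 ppi
pitch = 1 / ppi                        # inches between adjacent pixels
theta = math.radians(1 / 60)           # ~1 arc-minute acuity
print(f"Individual pixels merge beyond about {pitch / math.tan(theta):.0f} in")
```

That works out to roughly 16 in, so at a typical 20 in lap distance the density is just past the acuity limit; sit closer, or have better-than-20/20 vision, and the extra resolution is still doing something.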
 
Steady-state and impulse spectra represent amplifier performance about as much as an MLSSA graph does for loudspeaker performance.
Agreed - if properly carried out, these tests (and others like swept tones) comprehensively characterise the performance of amplifiers and the anechoic performance of speakers respectively. (You have to be careful about non-linearities that produce distortion in the MLS, for example, and the most accurate way to characterise the anechoic performance of a speaker is in an anechoic chamber.)

Are there no spectrum analysers that compare the input and output of real-time music (adjusted for transmission delay, if necessary) to show spectral differences?
Of course an analyser will show the frequency content of any signal you care to put in and can show the spectral content of input, output and difference signals from an amplifier.

I would say that characterising amplifier performance using music signals is as useful as using images of the countryside or portraits of your aunt is in characterising lens performance - ie, not very.
 
I would say that characterising amplifier performance using music signals is as useful as using images of the countryside or portraits of your aunt is in characterising lens performance - ie, not very.
Of course, but I was merely trying to ascertain if there was a way that a spectrum analyser could be used to identify differences people hear between different amplifiers or to rule out imagined differences.
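
One concrete way to do that is the classic null (difference) test, sketched below: time-align the output against the input by cross-correlation, gain-match by least squares, subtract, and report the residual level. A residual far below audibility is good evidence the heard differences were imagined. The function name and toy data here are purely illustrative.

```python
import numpy as np

def null_test(inp, out):
    """Return the level-matched, time-aligned difference between
    input and output, in dB relative to the input."""
    # Time-align: the cross-correlation peak gives the lag of out vs inp.
    lag = int(np.argmax(np.correlate(out, inp, mode="full"))) - (len(inp) - 1)
    if lag > 0:
        inp, out = inp[: len(inp) - lag], out[lag:]
    elif lag < 0:
        inp, out = inp[-lag:], out[: len(out) + lag]
    # Least-squares gain match, then subtract to leave only the differences.
    gain = np.dot(out, inp) / np.dot(inp, inp)
    residual = inp - out / gain
    return 20 * np.log10(np.linalg.norm(residual) / np.linalg.norm(inp))

# Toy check: the "output" is the input delayed by 37 samples, doubled in
# level, plus a little noise standing in for amplifier error.
rng = np.random.default_rng(0)
x = rng.standard_normal(10_000)
y = 2.0 * np.concatenate([np.zeros(37), x[:-37]]) + 1e-4 * rng.standard_normal(10_000)
print(f"Residual: {null_test(x, y):.1f} dB")
```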
 