"Is it simply a case of playing some music at the desired level, measuring the AC voltage across the amp's loudspeaker outputs, squaring that voltage, then dividing by the resistance of the loudspeakers (8 ohms)?"

It depends what you actually mean by 'amplifier watts'. If you mean how much power the amp is generating, then no, this method will not be accurate. A speaker is a reactive load, and unlike a resistor the current and voltage can go out of phase; this is known as the speaker's phase angle. It can be the destroyer of lesser-built amps, especially those with BJT output stages. As the phase angle between voltage and current increases, more power is dissipated by the output transistors as heat rather than driving the speaker load, and this varies with frequency. For example, at 45 degrees, 80% of the amplifier's output is dissipated as heat in the output devices and only 20% actually drives the load.
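For what it's worth, the difference between the naive V²/R estimate and real power into a reactive load can be sketched in a few lines of Python. The voltage and impedance values here are hypothetical, just to illustrate the arithmetic; real power delivered to the load scales with cos(phase angle):

```python
import math

def nominal_watts(v_rms, impedance=8.0):
    """Naive estimate assuming a purely resistive load: P = V^2 / R."""
    return v_rms ** 2 / impedance

def real_power(v_rms, i_rms, phase_deg):
    """Real power delivered to a reactive load: P = V * I * cos(theta)."""
    return v_rms * i_rms * math.cos(math.radians(phase_deg))

# Hypothetical reading: 8.9 V RMS across a nominal 8-ohm speaker
v = 8.9
print(nominal_watts(v))                 # ~9.9 W if the load were purely resistive
# At a 45-degree phase angle, only cos(45°) ≈ 0.707 of V*I reaches the load
print(real_power(v, v / 8.0, 45.0))     # ~7.0 W; the shortfall heats the output devices
```

Note this simple cos(θ) factor only describes power delivered to the load; the extra stress on the output transistors that the post describes depends on the amp's design as well.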
I recall an article "explaining" that the reason transistor amplifiers don't sound as good as valve amplifiers is that they are often clipping, while valve amplifiers have a softer limit, and stating that a check with an oscilloscope will demonstrate the clipping. But when I checked my amplifier's (150 W/ch) typical output level (in volts, converting to power using the above formula) into my Harbeth HL5s at normal listening levels, I found peak levels only rarely exceeded a few watts.
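One wrinkle with the scope check above: an oscilloscope shows peak voltage, while power ratings use RMS. A minimal sketch of the conversion for a sine wave, with a made-up scope reading:

```python
# A scope shows peak voltage; for a sine wave V_rms = V_peak / sqrt(2),
# so the equivalent continuous power is P = V_peak^2 / (2 * R).
def watts_from_peak(v_peak, impedance=8.0):
    return v_peak ** 2 / (2 * impedance)

# Hypothetical reading: 16 V peak on the scope into a nominal 8-ohm load
print(watts_from_peak(16.0))  # 16.0 W — far below a 150 W/ch rating
```

Music peaks are brief, so even occasional tens of watts on the scope is consistent with an average level of only a few watts.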
"I have a sound meter on my phone which shows the dB. Don't know how accurate it is."

The answer is: not very, unless it has been calibrated!
Could you sit 1 metre away from a speaker with a decibel-meter phone app and adjust the volume until the meter reads the same as your speaker's sensitivity rating? At that point you'd be using 1 watt.
Or have I massively misunderstood something?
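Assuming the sensitivity rating is the usual dB @ 1 W @ 1 m figure and the meter is accurate, that idea generalises: every 10 dB above the sensitivity figure means ten times the power. A rough sketch (the free-field 6 dB-per-doubling-of-distance correction is an idealisation; rooms behave differently):

```python
import math

def watts_from_spl(measured_spl_db, sensitivity_db, distance_m=1.0):
    """
    Rough power estimate from an SPL reading at distance_m metres.
    sensitivity_db is the speaker's rating in dB @ 1 W @ 1 m.
    Assumes free-field point-source spreading: ~6 dB lost per
    doubling of distance, which real rooms only approximate.
    """
    distance_loss_db = 20 * math.log10(distance_m)
    db_above_1w = measured_spl_db + distance_loss_db - sensitivity_db
    return 10 ** (db_above_1w / 10)

# Hypothetical: an 86 dB/W/m speaker measuring 96 dB at 1 m
print(watts_from_spl(96, 86))  # 10.0 W (approximately)
```

So reading exactly the sensitivity figure at 1 m corresponds to about 1 watt, as suggested above, within whatever error the phone's microphone adds.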
Is there a reason for wanting to find out?