Amplifier input sensitivity

Nigel

If an amplifier's line-level input sensitivity is 200mV, is there a voltage beyond which you could cause the amp to operate outside its normal parameters? Or is it solely a matter of gain?

An example being a CD player output of 2V. Is this the voltage the 200mV amp input prefers to receive? If we take an integrated amp, turn its volume to max and connect a preamp to one of its inputs, then as the pre's volume increases its output voltage presumably rises too, I imagine to a fair bit higher than 2V. Is there a point where the voltage reached starts causing issues other than the result simply being too loud?
 
Input sensitivity 200 mV means that is the signal required to produce full output. Going beyond that, with volume turned up, results in clipping (heavy distortion). Depending on where in the circuit path the clipping occurs, turning down the volume might or might not help.
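
To put rough numbers on that (a sketch only, assuming a 100 W into 8 ohm amp for illustration; the figures aren't from any particular model):

import math

sensitivity_v = 0.200                                   # input (V RMS) that gives full rated output
rated_power_w = 100.0                                   # assumed rating, purely illustrative
load_ohms = 8.0                                         # assumed speaker load

v_out_full = math.sqrt(rated_power_w * load_ohms)       # about 28.3 V RMS at the onset of clipping
gain_db = 20 * math.log10(v_out_full / sensitivity_v)   # about 43 dB of voltage gain
excess_db = 20 * math.log10(2.0 / sensitivity_v)        # a 2 V source is 20 dB more than needed
print(round(gain_db, 1), round(excess_db, 1))           # 43.0 20.0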
 
Why have the input sensitivity at 200mV? Wouldn't it be better to have it higher? Why not have it the same as the CD player's output voltage?
 
It's a compromise. The assumption is that you'll normally want the volume control set well *below* its maximum rotation, so you can wind it up when the source level is unusually low, but also have room to turn it down for very loud inputs.

2V is the max possible from a typical CD player, etc. Having a sensitivity of 200mV tends to mean a volume control setting that is somewhere lower than 'half way'. But add in variations in speaker sensitivity and power amp gain/power and this is a moveable feast. So no single value will suit all cases. Hence a compromise. In any *specific* case a higher or lower value might be preferred. But the maker can't know that when he makes each set.

Logically, it would make sense for a set of standard 'attenuator' box/cable items to be on sale so people can add one if the system they have requires an awkward volume control setting. But for whatever reason there is no standard market for this. That said, maybe that's because the usual CD output and amp input values are about right in most cases.
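
For what it's worth, the arithmetic behind such an attenuator is just a potential divider. A minimal sketch, with resistor values I've picked purely as examples, and ignoring source and load impedances (which shift the result a little in practice):

import math

def attenuation_db(r_series, r_shunt):
    # series resistor into a shunt resistor across the following input
    return 20 * math.log10(r_shunt / (r_series + r_shunt))

print(round(attenuation_db(6800, 3300), 1))    # about -9.7 dB
print(round(attenuation_db(9100, 1000), 1))    # about -20.1 dB, enough to bring 2 V down to ~200 mV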

All that said, I certainly tend to attenuate levels. But also use a mix of ancient and newer kit, and only play at quite low output powers.
 
There is a huge output difference between source components, e.g. compare say a Quad FM3 tuner with a typical modern digital source: the Quad's output is a tiny fraction of the level. The problem for the amp/preamp maker is that their product has to deal with both extremes. Annoyingly this means the usable volume control range is only a couple of ‘hours’ with CD, and anything over about 11 o’clock is likely close to clipping the power amp stage. I also have a suspicion that many amp designers view their purchasers as being dumb enough to equate volume knob position with available headroom, which couldn’t be further from the truth with a modern 2V+ source.

Also bear in mind that the whole 2V+ thing is a comparatively modern thing. Back in the '50s-'70s the only thing you’d find pushing that sort of voltage out would be a high-speed studio tape machine; everything else would be in the mV range somewhere. This is why it is often hard to connect CD players to vintage preamps without using inline attenuators.

As ever the problem with audio is the lack of formal standards and the need for backward compatibility, e.g. I still find it incomprehensible that there is no standard for stylus-to-mounting-lug distance on phono cartridges; had they got that right back in the ‘50s there would be no need to align anything at all, it would just bolt into exactly the right place! I have no idea why they thought CD needed to be so much louder than preceding source components, but here we are!
 
2V (+8dBm) has long been the broadcast/studio peak level for all equipment, albeit nominally at 600 ohms. It also makes a lot more sense for DAC-based equipment, since there is no point in having a theoretical dynamic range of 96 dB if half of it is lost in analogue noise.

I suspect that older valve amplifiers with high sensitivity such as the Leak and Mullard designs were made that way primarily because of the way the valve line-up worked and also because it suited the typical output levels of crystal pickups and AM valve tuners.
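
On the 96 dB point above, a rough sketch of why the higher full-scale voltage helps (96 dB being the usual theoretical 16-bit figure):

dynamic_range_db = 96.0                           # theoretical 16-bit dynamic range
for full_scale_v in (2.0, 0.2):
    noise_floor_uv = full_scale_v / 10 ** (dynamic_range_db / 20) * 1e6
    print(full_scale_v, "V full scale ->", round(noise_floor_uv, 1), "uV theoretical floor")
# roughly 31.7 uV below 2 V, but only about 3.2 uV below 200 mV, which is down
# around the noise of the analogue stages themselves, so part of the range is wasted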
 
This might be a silly question, but never mind, here goes...

Suppose I have an amplifier with modest output power. It has an input sensitivity of 200mV. With a 2V signal it has difficulty driving my speakers to a satisfactory level.

If I give it a 4V signal, will it do better? If yes, are there any drawbacks to doing this?

I have been trying to get a satisfactory answer to this question (which seems a reasonable one to a person with no knowledge of electronics) for a while...
 
"Maybe" :)

It depends. Maximum power is maximum power. If you have reached it, all that increasing the input signal gets you is more distortion. If the amp really has an input sensitivity of 200 mV, and you drive it with a 2 V signal, you have already reached that point, and your amp is simply not powerful enough for your speakers.
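
A toy illustration of that, treating the amp as a fixed gain followed by a hard limit at its maximum output; all the numbers here are assumed purely for illustration:

sensitivity_v = 0.2               # input that just reaches full output
v_max = 28.3                      # assumed maximum output swing (V RMS, ~100 W into 8 ohms)
gain = v_max / sensitivity_v      # fixed voltage gain

for v_in in (0.2, 2.0, 4.0):
    v_out = min(gain * v_in, v_max)       # anything beyond the limit is simply flattened
    print(v_in, "V in ->", round(v_out, 1), "V out")
# 0.2 V just reaches full output; 2 V and 4 V both give the same 28.3 V,
# just with more and more of the waveform squared off (i.e. distortion)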
 
Thank you!

It is a pity that available media do not standardise levels. If every broadcast, CD or downloadable recording had a maximum level of 0dB (or whatever), setting volume at the amplifier would be simple and it could be largely left alone once chosen.

On the face of it, this doesn't seem too much to ask... yet the difference in levels between a live Radio 3 broadcast and almost any studio album of non-classical music is enormous.
 
That's because an R3 broadcast (particularly via the net) doesn't have the level compression which gets lathered onto a lot of other source material. R3 give you the real dynamic range of the source they are conveying, so far as humanly possible.

So argue with the makers of CDs and digital files that are level compressed to the max - often degrading the sound as a result!

Note that transient peaks in real acoustic music are often 5 to 10 times larger than the rest of the music. That's a big difference, so it requires headroom if you don't want peaks to be distorted.
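
In dB terms, that 5-to-10-times figure works out as follows (just the arithmetic):

import math

for peak_ratio in (5, 10):
    headroom_db = 20 * math.log10(peak_ratio)     # voltage ratio in dB
    power_ratio = peak_ratio ** 2                 # power goes with the square of voltage
    print(peak_ratio, "x peaks ->", round(headroom_db, 1), "dB of headroom,", power_ratio, "x the power")
# 5x peaks need ~14 dB (25x the power); 10x peaks need 20 dB (100x the power)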
 
I wish there was some way of controlling the announcers' volume relative to the music. If I set the internet feed to a reasonable level for the music, the announcers are much too loud. I realise why it's done that way, but it would be nice to be able to control it.
 
This might be a silly question, but never mind, here goes...

Suppose I have an amplifier with modest output power. It has an input sensitivity of 200mV. With a 2V signal it has difficulty driving my speakers to a satisfactory level.

If I give it a 4V signal, will it do better? If yes, are there any drawbacks to doing this?

I have been trying to get a satisfactory answer to this question (which seems a reasonable one to a person with no knowledge of electronics) for a while...

Power output would be irrelevant. The stated sensitivity of a power amp is the input that will just drive it to full output. Hence if it is 100W with 200mV sensitivity, then 200mV in gives 100W out; 210mV in and it is starting to clip.

2V is the very maximum output from a CD or most other sources these days, as they have gradually standardised on this due to CD being the "standard" source since it came out in 1983... So phono stages, tuners and tape decks made much after, say, 1985-ish are generally designed to have a similar output, i.e. 2V, so that they all have similar loudness for a given volume control setting.

2V makes reasonable sense as it is quite a high level for line level, and hence gear designed for 2V out and a reasonably high input level will be rather more immune to noise and interference etc. than gear with 100mV sensitivity.

Oh and in case it was about to be anyone's next question... amps are generally more sensitive than 2V to allow you to reach max power even on very quiet recordings. 775mV was always a very common sensitivity for historical reasons, i.e. it is 1mW of power into 600 ohms, which was "0VU" back when this power into that impedance was the standard in telephony applications. This standard kind of migrated into studio use etc.

A 2V-ish output and 1V-ish sensitivity in power amps means that a passive pre can easily be used and will be able to drive the power amp to full output. Many integrateds are in effect a power amp with a passive pre and source selector, of course...
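
A quick check of that, taking a 2 V source maximum and a 1 V power amp sensitivity at face value:

import math

source_max_v = 2.0          # typical CD/DAC maximum output
power_amp_sens_v = 1.0      # assumed power amp sensitivity for full output
margin_db = 20 * math.log10(source_max_v / power_amp_sens_v)
print(round(margin_db, 1))  # about 6 dB in hand, so a passive pre (which can only attenuate) is fine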

"Flyers" such as the Quad II with its 3V sensitivity will need an active pre amp with gain to amplify the max 2V up to 3V obviously.
 
I hesitate to take issue with you, Jez, but 1mW is 0dBm, and into 600 ohms that is 775mV, but it is not 0VU. The Bell spec for 0VU is +4dBm (which means a steady tone at +8 is actually off the scale of a VU meter; this doesn't matter with programme material, which shouldn't be peaking into the red anyway).

In most modern situations specifying level in dBm is incorrect, as the circuit is rarely a proper 600 ohm line; dBu is correct as it refers to voltage and not power.

Just to add a little more confusion, 0dBm is '4' on a BBC-spec Peak Programme Meter, with +8 being '6', a much more sensible scheme in my opinion.
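
For anyone following the numbers, the voltage arithmetic behind those reference levels, using the 600 ohm convention discussed above:

import math

v_0dbm = math.sqrt(0.001 * 600)          # 1 mW into 600 ohms is about 0.775 V (also the 0 dBu reference)
v_0vu  = v_0dbm * 10 ** (4 / 20)         # +4 dBm, the Bell 0 VU level, about 1.23 V
v_peak = v_0dbm * 10 ** (8 / 20)         # +8 dBm, about 1.95 V, i.e. roughly the 2 V CD level
print(round(v_0dbm, 3), round(v_0vu, 2), round(v_peak, 2))   # 0.775 1.23 1.95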
 
Hi,
I would like to add some confusion to this.

I recall, and confirmed on the internet, that the 2 volt output from a CD player is the RMS value, so the peak voltage is 2.83 volts.

On the sensitivity, I checked an amplifier from Marantz:
https://www.marantz.co.uk/DocumentMaster/UK/Marantz_PM8006_Owners_Manual.pdf

Page 41 provides the specifications as follows:

Input Sensitivity:
CD, TUNER, NETWORK, AUX, RECORDER: 220 mV / 20 kΩ
POWER AMP DIRECT IN: 1.6 V / 15 kΩ

S/N Ratio:
CD, TUNER, NETWORK, AUX, RECORDER: 106 dB (2 V input, rated output)
POWER AMP DIRECT IN: 125 dB (rated output)

So we have the output of the amplifier being 100 watts for a 2 volt input as per the S/N section, but the stated sensitivity is 220mV. The manual does not state whether these are RMS values or peak, although I would anticipate that they are RMS values. Also, the amplifier "direct in" needs 1.6 volts for rated output.

The difference between 220mV and 2 volts is probably due to the position of the volume control, although this is not stated. The "direct in" probably has fewer components between the input and the power amplifier section, hence the reduced signal level of 1.6 volts and the higher S/N of 125dB.

I would expect the voltage levels at the input to have nil adverse effect on the preamp circuitry but, with the volume control at its maximum, to overdrive the amplifier.

Regards,
Shadders.
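
Putting rough numbers on those Marantz figures (a sketch only: it takes the 100 W output mentioned in the post, assumes an 8 ohm load, and assumes everything is quoted in RMS):

import math

rated_power_w = 100.0        # figure quoted in the post above
load_ohms = 8.0              # assumed load
line_sens_v = 0.220          # CD/TUNER/etc. sensitivity for rated output
direct_sens_v = 1.6          # POWER AMP DIRECT IN sensitivity for rated output

v_out = math.sqrt(rated_power_w * load_ohms)                     # about 28.3 V RMS at rated output
power_amp_gain_db = 20 * math.log10(v_out / direct_sens_v)       # about 25 dB
preamp_gain_db = 20 * math.log10(direct_sens_v / line_sens_v)    # about 17 dB at full volume
excess_db = 20 * math.log10(2.0 / line_sens_v)                   # a 2 V source is ~19 dB over sensitivity
print(round(power_amp_gain_db, 1), round(preamp_gain_db, 1), round(excess_db, 1))
# so most of the time the volume control is there to absorb that ~19 dB of excess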
 
I hesitate to take issue with you, Jez, but 1mW is 0dBm, and into 600 ohms that is 775mV, but it is not 0VU. The Bell spec for 0VU is +4dBm (which means a steady tone at +8 is actually off the scale of a VU meter; this doesn't matter with programme material, which shouldn't be peaking into the red anyway).


I know, hence it was in quotation marks... I thought it less confusing for the non-technical... But I'll change it in another thread where the same matter just came up!
 
All irrelevant. Mainly marketing lies going on in those Marantz specs... "twisting" how things are described to make them look as good as possible.
Obviously there is an active preamp section here, hence the 220mV line sensitivity whereas the power amp sensitivity is 1.6V.
 
I wish there was some way of controlling the announcers' volume relative to the music. If I set the internet feed to a reasonable level for the music, the announcers are much too loud. I realise why it's done that way, but it would be nice to be able to control it.

Echoed! Yes, I have tried asking the BBC where they get announcers with voices as loud as a full orchestra. It can be a real annoyance.
 
Ah! I wondered if it was only me reaching for the volume control during announcements on the Radio 3 digital stream.

But I must say that my last change of loudspeakers plus amplifier did, surprisingly, ameliorate the effect quite a lot, to the extent that the announcers are still a bit loud but no longer annoying. I'm not sure I could explain quite why.
 
I have found that the use of in-line attenuators (-10dB, Rothwell jobbies) not only made more of the volume control's arc usable, as you'd expect, but also removed a harsh, edgy feel to one particular amplifier. It certainly sounded like the amp was distorting in some respect, on a 2V CD input, despite the input sensitivity of the amp being pretty typical and around 150-200 mV (IIRC). Any thoughts as to what might be going on here?
 

