Naim amps, bias-current setting and distortion

martin clark

note: this is a re-post from this thread. The full text, and any future revisits/edits, will be maintained at my website here.

Spurred on by this thread, here's a quick investigation of the link between the quiescent bias-current setting and linearity for the Naim quasi-complementary power amp design.

As I posted in that thread, 'measuring the total current draw of the amp is actually a pretty imprecise way of setting bias - the only thing that matters is the current at the output transistors, and specifically the voltage across the emitter resistors; Doug Self has published some pretty exhaustive analysis of this for many different output configurations. If you change the operating voltages, the PSU 'stiffness', the regulation of the front end or better definition of its operating current, or the temperature at which all the BJTs operate... you change the balance of the total amp current draw to the output stage in untested ways, probably quite radically. Naim amps have no thermal compensation for the bias setting, which is why it's critical 1) to set them warm and 2) not to run too high a bias - they can run away thermally. So it's entirely possible that with a heavily modified amp, a different bias set by total current draw is correct (versus the usual advice of 36-38mA total on the positive rail) - because that's what it takes to get that particular output stage into its sweet spot.'

Bringing these two thoughts together: the right place to measure, and therefore set, the bias is via the voltage appearing across the 0R22 emitter resistors.
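For reference, the Ohm's-law conversion from that millivolt reading to the actual standing current is trivial; here it is as a throwaway Python sketch - the helper name and example figures are mine, purely for illustration:

R_EMITTER = 0.22  # ohms - the 0R22 emitter resistor

def bias_current_ma(v_mv: float, across_both: bool = False) -> float:
    """Quiescent current (mA) implied by a DC millivolt reading.

    A meter across BOTH emitter resistors sees the same current
    dropping twice the resistance, so the reading doubles.
    """
    r = 2 * R_EMITTER if across_both else R_EMITTER
    return v_mv / r  # mV / ohm = mA

print(bias_current_ma(7.0))                    # ~31.8mA from 7.0mV across one resistor
print(bias_current_ma(8.0, across_both=True))  # ~18.2mA from 8.0mV across both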

A test case
The Nait 2 uses the same basic architecture as all Naim power amps up to the 250.2; it's the only one I have to hand, and it's got a handy level control bolted on the front, so here goes. Here's a picture of the setup:

[image: rig.jpg - the test setup]


I burnt a 1hr track of a 1kHz, 0dBFS sine wave to CD using a utility on the ancient iMac, so we have a source potentially capable of at least -93dB distortion (0.0022%). This was played by my long-suffering CD2 and fed to the signal input of the Nait 2. At the Nait's output, a dummy load was plugged in (2 x 10W, 12R metal-film resistors in parallel, for a 6ohm, 20W dummy load):
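(An aside on units: %THD and dB-below-fundamental are the same thing on different scales. A quick Python conversion for anyone following along - my naming, nothing official:)

import math

def db_to_percent(db: float) -> float:
    """Distortion in dB below the fundamental -> %THD."""
    return 10 ** (db / 20) * 100

def percent_to_db(pct: float) -> float:
    """%THD -> dB below the fundamental."""
    return 20 * math.log10(pct / 100)

print(f"{db_to_percent(-93):.4f}%")    # ~0.0022%, as quoted above
print(f"{percent_to_db(0.01):.0f}dB")  # 0.01% THD is -80dB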

[image: load_test.jpg - the dummy load]


Attached to this load is a resistive 100:1 attenuator (the white wires), which in turn feeds the input of the craptop's soundcard. This input is analysed using SpectraRTA for real-time %THD analysis (and a few other things I'll write about some day). The large DVM is temporarily soldered to the Nait to set/monitor the DC mV across an emitter resistor.

So: full-scale signal into the Nait, the volume control used to set power output, and the Nait's balance control used to null out channel differences to within 5mV, so that the second DVM, plugged into the other channel, monitored Vrms across the load. This was arbitrarily set to 1W into 6ohms (2.450VAC rms) at 1kHz. Also note the glass of cheeky young Rioja. Room temp was within 0.5degC all evening, and the transformer-powered halogen and CRT iMac were turned off during testing to avoid stray fields as far as possible.
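For anyone checking the rig's arithmetic, here's a quick sketch using only the figures above (my own framing, nothing more):

import math

r_load = (12.0 * 12.0) / (12.0 + 12.0)  # two 12R resistors in parallel -> 6 ohms
p_rating = 10 + 10                      # two 10W parts -> a 20W dummy load

v_rms = math.sqrt(1.0 * r_load)         # 1W into 6 ohms -> ~2.449V rms (the 2.450VAC set point)
v_soundcard = v_rms / 100               # after the 100:1 attenuator -> ~24.5mV rms

print(f"{r_load:.0f}ohm, {p_rating}W load; set {v_rms:.3f}V rms; "
      f"{v_soundcard * 1000:.1f}mV at the soundcard input")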

Test regime: with the amp warmed for 2 hours (pink noise at 5W for 10mins, 1W for the rest) I took an initial reading as-was to see where things stood (6.9mV; 0.0171% THD at 1 watt into 6ohms). The regime was simple, if tedious: adjust Iq, measuring mV across an emitter resistor; allow to stabilise for 1min, readjusting if required; then run the CD input, taking the %THD figure after 30secs (slowest averaging used, last significant figure discarded). Pause the track, re-check Iq, then reset for the next test (pot swung back to the centre of the test value). The bias figure was also checked 5-10secs after the test track was stopped, to ensure nothing too radical had gone wrong.

It was quite noticeable that %THD drifted after 10secs or so under load, indicating bias drift as the amplifier's temperature changed - though it could not be sensed on the heatsink - and this was reflected in inflated mV readings that drifted down a few tenths after stopping the test input. Also note that shifts of 0.2mV could be easily and immediately arranged by blowing gently on the circuit! After pausing the test track it would take 10-30secs for the bias to settle again; even so, 3mins were allowed between iterations to ensure settled DC values. Waiting for these 'thermal tails' was the most boring bit of the testing.

This is where the Rioja came in.

Total distortion levels averaged around -76 to -80dB at 1 watt - pretty respectable overall, given it's the sum total of 1) my home-bodged CD2, 2) the Nait, 3) A-D conversion on a crap laptop and 4) the manky bits of wire and hasty lash-up bolting these bits of kit together. In any case the numbers should be viewed as relative, NOT absolute. At the end of the session - 3hrs after I started - with the bias reset to where it began, the amp measured the same as when I started, so my errors appear to be monotonic at least...

To cut to the chase:

[image: nait_distortion.gif - %THD vs bias setting (mV across one 0R22 emitter resistor)]


Here's what happens when you test %THD against the quiescent bias setting, measured in mV across one 0R22 emitter resistor. The curve stops at 8.5mV because experimentation significantly beyond that brought no substantive change (except that I had to open another bottle). So why should we care? Well, apart from anything else the curve shows:

1) The Nait 2 is a low-distortion design, whatever people might think...
2) The curve is very flat - within limits. Ensuring at least 4.2-4.5mV across the emitter resistor gets you in the right ballpark. A lot more doesn't help much, so don't waste the power (see the quick arithmetic after this list).
3) The absolute minimum occurred in the range 7.0-7.5mV - for this amplifier.
4) Under-bias has a far more detrimental effect than over-bias - as we should expect for a Class B amplifier design (but drastic over-bias is a topic for another day).
5) If you squint hard enough, there appear to be two minima, or at least a very small increase in distortion around 6-6.5mV. Why? Well, perhaps the two halves of the output stage actually want different bias currents for minimum composite distortion... more things to play with another time.
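To put those millivolt figures in more familiar terms, here's a rough sketch of the implied standing current and idle dissipation. Note the +/-24V rail figure is an assumption of mine purely for illustration - it is not a quoted Nait 2 spec, so check your own amp:

R_EMITTER = 0.22  # ohms
V_RAIL = 24.0     # volts per rail - ASSUMED for illustration only

for v_mv in (4.5, 7.0, 8.5):
    iq_ma = v_mv / R_EMITTER            # standing current through the output pair, mA
    p_idle = 2 * V_RAIL * iq_ma / 1000  # both output devices together, watts
    print(f"{v_mv}mV -> {iq_ma:.0f}mA, ~{p_idle:.1f}W idle in the output stage")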

So - based on a sample of one(!) - the recommendation is simple: with the amp cold, make sure your Naim's quasi-comp output stage shows at least 4.0mV across a 0R22 emitter resistor, that is, 7.2 to 8.9mV in total across BOTH resistors - which is a more accurate way to measure. But 1) just don't worry if it's within 2 to 4mV more, and 2) don't waste more bias current in the hope of lower distortion. That's all!


Afterthought: certainly don't turn the wick right up in the hope of 'Class A' operation. No, really, don't do it.

Well, since you asked... in the interests of research... and with a little adjustment (ahem), my Nait was temporarily fudged to just squeeze out the necessary 90mV of bias to run the test entirely in Class A. Result: 0.0135% THD (whereas just 7.0mV of bias resulted in 0.0165% THD). And at this level the bias was galloping away while I watched: the setting represents around 400mA of standing current, and the meter drifted well past the 500mA mark in under 15secs of the very abbreviated test run. Left unchecked the amp would have blown in short order. Now look at the 'gain': a difference of less than 0.003% (a < -90dB difference) but with the certainty of killing the amp. So don't believe anyone who says more bias is better... at least not without checking all the possible consequences in detail.
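For the record, here are the back-of-envelope numbers behind that warning, again with an assumed +/-24V rail - a sketch of the arithmetic only, not a model of the actual amplifier:

import math

R_EMITTER = 0.22  # ohms
V_RAIL = 24.0     # volts per rail - ASSUMED for illustration

i_peak = math.sqrt(2 * 1.0 / 6.0)   # peak load current at 1W into 6 ohms: ~0.577A
iq_min = i_peak / 2                 # push-pull Class A needs Iq >= Ipeak/2: ~289mA
v_bias = iq_min * R_EMITTER * 1000  # ~64mV across one 0R22 (the 90mV used gave margin)
p_idle = 2 * V_RAIL * 0.4           # ~19W idle at the ~400mA observed above

print(f"Ipeak {i_peak:.3f}A -> Iq >= {iq_min * 1000:.0f}mA ({v_bias:.0f}mV); "
      f"idle dissipation ~{p_idle:.0f}W, with no thermal compensation to hold it there")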
 

