Wear and tear on amplifiers

Servers, routers etc. have been designed to be constantly on and have much shorter lifetime expectations than a hi-fi amplifier. It's not a valid comparison.
Semiconductors and valves have lifetimes based on use and environment. For semis, higher temperatures mean shorter lifetimes and higher voltages mean a shorter life. Switching on and off within the stated operating parameters is not one of the factors.

It's disconcerting to hear that you don't give two hoots for the environment. It's not us that will have to live with the poor choices made now, what about future generations?
Switch off what you can and when you can.
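To put rough numbers on the temperature point: reliability engineers usually model it with an Arrhenius acceleration factor. The sketch below is purely illustrative and assumes a typical activation energy of around 0.7 eV, which is a ballpark figure rather than a value for any particular device.

```python
import math

BOLTZMANN_EV = 8.617e-5  # Boltzmann constant, eV/K

def arrhenius_acceleration(t_cool_c, t_hot_c, ea_ev=0.7):
    """Roughly how many times faster a part ages at t_hot_c than at t_cool_c.

    Classic Arrhenius model; 0.7 eV is a commonly used ballpark activation
    energy, not a figure for any specific device.
    """
    t_cool_k = t_cool_c + 273.15
    t_hot_k = t_hot_c + 273.15
    return math.exp((ea_ev / BOLTZMANN_EV) * (1.0 / t_cool_k - 1.0 / t_hot_k))

# Example: a junction held at 85C ages roughly 8x faster than one at 55C.
print(round(arrhenius_acceleration(55, 85), 1))
```

In other words, running a part 30°C hotter can plausibly cost the better part of an order of magnitude in life, which is why temperature dominates these models rather than on/off cycling within spec.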
 
[Attached image: Audiolab 8200CDQ CD/DAC/preamp]
The Audiolab 8200CDQ is badly laid out with respect to caps getting very hot. Front right there is a large bank of caps with the heatsinks of components less than 1mm from them. They get very hot, so I don't have the top of the case on and have rigged up a slow-running PC fan to keep that area cool.
 
The Audiolab 8200CDQ is badly laid out with respect to caps getting very hot. Front right there is a large bank of caps with the heatsinks of components less than 1mm from them. They get very hot, so I don't have the top of the case on and have rigged up a slow-running PC fan to keep that area cool.

No kids or pets then or did you have them?
 
Having worked for a company supplying industry and government with thousands of pieces of computer equipment, the last thing you want to do is switch stuff off.

Server farms have many thousands of... well... server computers, switches, routers etc. No one would dream of switching anything off; exactly the opposite.

For equipment of this nature that is required for round the clock service I agree entirely. It also needs to be pointed out that server rooms are temperature controlled/actively cooled, and the typical active lifespan of a server is only three years or so.

Entirely different rules apply to hi-fi, which in most cases is only used for an hour or two a day. In this case it is massively more logical to turn it off, as capacitors etc. last so much longer that way compared to burning them 24/7 when all you need is maybe 6-8 hours of use per week. It is the difference between even really well made, cool-running and well-specified kit lasting around a decade between rebuilds versus maybe 40 years.
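To put numbers on the duty-cycle point, a quick back-of-envelope (the 7 hours a week is just the figure assumed above):

```python
# Powered hours per year under the two regimes discussed above.
always_on_hours = 24 * 365   # 8,760 h/yr left on 24/7
listening_hours = 7 * 52     # ~364 h/yr at roughly 7 hours of use per week

print(round(always_on_hours / listening_hours))  # ~24x more powered hours when left on
```

Roughly a 24:1 difference in powered hours per year, which is why the rebuild intervals diverge so sharply.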
 
Can you increase airflow around them when in use? What temps do the cases get up to in summer and in use?

They are well ventilated, sitting on a Quadraspire AV rack, 200mm from the shelf above them and with plenty of room all around. I have only had them a few weeks, so the summer is an unknown.
 
I have a pair of Class A monoblocks which obviously generate quite a bit of heat, which has me wondering: which mode of operation would cause the least wear and tear in the long run, turning them on and off two or three times a day/evening, or leaving them on all day and evening?
Everything is turned off at the end of the day so this is just about best practice for reliability.
Thanks
I would agree with the suggestion of setting up an alternative system if you mainly play background music, since you are leaving your Class A monoblocks on all day, which may add up to 12 to 15 hours a day. Class A amps emit quite a bit more heat than Class AB, so it may be prudent to switch the amps off if you are not doing any critical listening.

FWIW I also own a Luxman Class A amp and it surely runs warmer than warm. It's hot to the touch when in operation and certainly produces a lot more heat than my Class AB amps (Naim and Sonneteer), which run almost cool. I used to leave the Luxman on the whole day (approx. 15 hours) when I was at home, as it's nice to have background music playing at low volume. On days at home I would usually listen to instrumental piano music and the hi-fi would play non-stop for more than 10 hours, but not anymore. After some time I came to my senses and questioned whether it was doing the amp or the utility bills any good. I'm more concerned about the long-term effects on the amp, as I really love the Class A Luxman and want it to last as long as it can before something inside it blows up or it requires a service.

So, long story short, I set up a second system in the same room, at a different location, for casual listening. The main system based on the Class A amp is reserved for critical listening. Even though a Class AB amp is used in the second, small system, I switch it off when I'm done listening. Gone are the days when I used to leave my Naim amps switched on 24/7 for years (well, that's a bit of an exaggeration, as the amps may have been switched off once or twice a year). The Naim amps sound better if left permanently on, but I'll leave that for another day.

If it's a Class AB amp that runs cool, I wouldn't be too critical about leaving the amp on the whole day though.
 
I always switch off when I’m done listening, even after and between short listening sessions.
Why would anyone leave their hi-fi on anyway?
It doesn’t make any sense to me, electrically.
Don’t get me started with the usual, “Well it does sound better after a half hour warm-up” excuse.
The only times I leave hi-fi bits on for hours and days are after a repair (nothing to do with running-in, another myth). But I don’t listen to them!
And expensive power valves are excluded from that long testing process.


Don’t start me on this green crap. Get a life; the planet’s not ours, we are just annoying fleas witnessing the planet’s evolution.

Really?
 
I have a pair of Class A monoblocks which obviously generate quite a bit of heat, which has me wondering: which mode of operation would cause the least wear and tear in the long run, turning them on and off two or three times a day/evening, or leaving them on all day and evening?
Everything is turned off at the end of the day so this is just about best practice for reliability.
Thanks

Can’t comment on the wear and tear implications of turning on and off throughout the day.

If you want to use it then use it, turn off when not in use.

About 20 years ago, when I was going through my Russ Andrews phase, there was a recommendation to leave kit on 24/7 to get the best sound. I think there was also advice to put some kit in the fridge or something equally crazy....

I had active speakers which ran partly in Class A, so I used to waste electricity routinely, and I felt sweatier than normal in the summer months with die-cast aluminium cabinets acting like radiators.

Not sure what happened, perhaps just getting old, but I now feel very guilty about leaving my kit on when I am not using it, like light bulbs, so I always switch off. I also historically had the fear that power cycling had to be doing harm, so the less of that the better.
 
My Mark Levinson 380S preamp manual says to leave it on permanently; the No.333 power amp consumes 350W at idle and 200W in standby. What is being powered in that mode, I wonder?
 
Rebuilding my valve amp recently, it was great to have a choice of 105°C, 7,000-hour-rated electrolytic capacitors. Solar panel and data centre usage was the rationale mentioned in the components' application notes. Anyone know the life of electrolytic capacitors unloaded at room temperature?
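For what it's worth, the rule of thumb quoted in most electrolytic application notes is that endurance roughly doubles for every 10°C the capacitor runs below its rated temperature. A rough sketch of that rule is below; the 45°C core temperature is just an assumed example, manufacturers generally caution against extrapolating the result much beyond 15 years or so because seal and electrolyte ageing take over, and unpowered shelf life is a separate spec, often quoted as only a year or two at room temperature before reforming is recommended.

```python
def estimated_cap_life_hours(rated_hours, rated_temp_c, core_temp_c):
    """10-degree rule of thumb: endurance doubles for every 10C below the rating.

    A rough guide only; ripple-current heating and the datasheet's own
    derating curves matter more than this single formula.
    """
    return rated_hours * 2 ** ((rated_temp_c - core_temp_c) / 10)

# 7,000 h rated at 105C, with an assumed 45C core temperature inside the amp:
hours = estimated_cap_life_hours(7000, 105, 45)
print(round(hours), round(hours / 8760))  # ~448,000 h, i.e. ~51 years continuous
```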
 
Servers, routers etc. have been designed to be constantly on and have much shorter lifetime expectations than a hi-fi amplifier. It's not a valid comparison.
Semiconductors and valves have lifetimes based on use and environment. For semis, higher temperatures mean shorter lifetimes and higher voltages mean a shorter life. Switching on and off within the stated operating parameters is not one of the factors.

It's disconcerting to hear that you don't give two hoots for the environment. It's not us that will have to live with the poor choices made now, what about future generations?
Switch off what you can and when you can.

Whilst you can manage the lifetime of servers and the like, there are complex interactions between components and environments.

There are many studies on on/off and dormancy profiles, generally produced at the military’s request; they make interesting reading and do not depend on hearsay. There is very much a focus on the reality of on/off and dormancy performance, regardless of component manufacturers’ data!

Humanity continues to desecrate this planet knowing what they’re doing... money talks, while I try not to participate as much as my morality determines. I am not bound by stupid rules!

Right now look at the environmental costs of masks, testing kits, PCR testing and building super-sized testing laboratories. Where’s the disposal point? If you’re wearing a mask and testing yourself every two days as “The Science” says, wow, are you helping the environment.

Gary
 
For equipment of this nature that is required for round the clock service I agree entirely. It also needs to be pointed out that server rooms are temperature controlled/actively cooled, and the typical active lifespan of a server is only three years or so.

Entirely different rules apply to hi-fi, which in most cases is only used for an hour or two a day. In this case it is massively more logical to turn it off, as capacitors etc. last so much longer that way compared to burning them 24/7 when all you need is maybe 6-8 hours of use per week. It is the difference between even really well made, cool-running and well-specified kit lasting around a decade between rebuilds versus maybe 40 years.


I love the assumption that all server rooms are temperature controlled/actively cooled. In theory a requirement, in reality not so much. I have been in many hundreds of server rooms around the globe, and the environment and three-year lifespan are a joke.

Server Farms are not to be confused with server rooms.

Please see post #33 for comments on dormancy and on/off switching.

Gary
 
There are many studies on on/off and dormancy profiles, generally produced at the military’s request; they make interesting reading and do not depend on hearsay. There is very much a focus on the reality of on/off and dormancy performance, regardless of component manufacturers’ data!

Profiling what parameters, lifetimes? Is this servers again?
 
Not servers; component failure rates, including but not limited to capacitors.

Useful if I take my amps into battle with me.

I worked for 10 years at a large US defense manufacturer, so I've seen a lot of this stuff, including the bits coming back as claims, and it's just not applicable to a hi-fi amplifier.
I'd also add that the data was poor because many of the designs are poor, particularly from a design-for-manufacture point of view, which builds in failures because they're just nuts to build. If we got a board through to final acceptance without a single piece of rework it would be a miracle. Returns weren't generally a thing when most of the time these things just sit in a warehouse for years and then get dropped onto the enemy.
 
I love the assumption that all server rooms are temperature controlled/actively cooled. In theory a requirement, in reality not so much. I have been in many hundreds of server rooms around the globe, and the environment and three-year lifespan are a joke.

Most corporate ones I’ve worked in were, though I’m used to working with pretty serious kit (e.g. AS/400s etc. as well as server racks). I’ve been “retired” for 20 years now, so things may well have changed a fair bit over that period, but I bet any decent corporate IT facility still keeps a dedicated server room. The lifetime was a generalisation; speaking as an ex-IT manager, I always used to factor TCO over a three-year depreciation cycle, i.e. anything beyond that point and the item was effectively ‘free’. There were always many older machines knocking around; in fact I’ve contracted in several places where absolutely mission-critical stuff was still running on old DOS IBM 5150s with 5 1/4” floppies right into the late Pentium era! Almost always the software was so dependent on specific clunky old interface cards etc. that it just couldn’t be migrated to more modern Windows or OS/2 machines!

PS Moore’s Law was far more acute 20-30 years ago; these days computers are still useful performance-wise a decade or so later. They weren’t back then, e.g. a 200MHz Pentium was a very different machine to, say, a 286 from a decade previous.
 
Most corporate ones I’ve worked in were, though I’m used to working with pretty serious kit (e.g. AS/400s etc. as well as server racks). I’ve been “retired” for 20 years now, so things may well have changed a fair bit over that period, but I bet any decent corporate IT facility still keeps a dedicated server room. The lifetime was a generalisation; speaking as an ex-IT manager, I always used to factor TCO over a three-year depreciation cycle, i.e. anything beyond that point and the item was effectively ‘free’. There were always many older machines knocking around; in fact I’ve contracted in several places where absolutely mission-critical stuff was still running on old DOS IBM 5150s with 5 1/4” floppies right into the late Pentium era! Almost always the software was so dependent on specific clunky old interface cards etc. that it just couldn’t be migrated to more modern Windows or OS/2 machines!

PS Moore’s Law was far more acute 20-30 years ago; these days computers are still useful performance-wise a decade or so later. They weren’t back then, e.g. a 200MHz Pentium was a very different machine to, say, a 286 from a decade previous.

You would be surprised how many enterprises still use very old equipment, for the reasons you state. Banks tend to be the worst historically, as they refuse to pay the cost of a model office.

As you know, Moore’s Law was not a law but an observation; we are still progressing, but at a slower rate.

Gary
 
Most corporate ones I’ve worked in were, though I’m used to working with pretty serious kit (e.g. AS/400s etc. as well as server racks). I’ve been “retired” for 20 years now, so things may well have changed a fair bit over that period, but I bet any decent corporate IT facility still keeps a dedicated server room. The lifetime was a generalisation; speaking as an ex-IT manager, I always used to factor TCO over a three-year depreciation cycle, i.e. anything beyond that point and the item was effectively ‘free’. There were always many older machines knocking around; in fact I’ve contracted in several places where absolutely mission-critical stuff was still running on old DOS IBM 5150s with 5 1/4” floppies right into the late Pentium era! Almost always the software was so dependent on specific clunky old interface cards etc. that it just couldn’t be migrated to more modern Windows or OS/2 machines!

PS Moore’s Law was far more acute 20-30 years ago; these days computers are still useful performance-wise a decade or so later. They weren’t back then, e.g. a 200MHz Pentium was a very different machine to, say, a 286 from a decade previous.

You would be surprised how many enterprises still use very old equipment, for the reasons you state. Banks tend to be the worst historically, as they refuse to pay the cost of a model office.

As you know, Moore’s Law was not a law but an observation

Useful if I take my amps into battle with me.

I worked for 10 years at a large US defense manufacturer, so I've seen a lot of this stuff, including the bits coming back as claims, and it's just not applicable to a hi-fi amplifier.
I'd also add that the data was poor because many of the designs are poor, particularly from a design-for-manufacture point of view, which builds in failures because they're just nuts to build. If we got a board through to final acceptance without a single piece of rework it would be a miracle. Returns weren't generally a thing when most of the time these things just sit in a warehouse for years and then get dropped onto the enemy.

Womack, Jones and Roos.
 

