75 ohm cables and Audio interconnects - what's the difference?

neilmack

pfm Member
The cables sold for digital applications commonly quote an impedance, yet interconnects for audio-band use never do. So how do they differ, and can a 75 ohm cable be used as an interconnect in an analogue setup? Or will it do terrible things to the signal?

Many thanks
Neil
 
I use 75 ohm single-ended cables and even have the cheek to use 110 ohm AES/EBU balanced cables for analogue. Just how naughty is that?
 

Cable impedance is only relevant at high frequencies, where the cable length becomes comparable with the wavelength of the signal. In the case of digital audio, the frequency of the stream is in excess of 1.4MHz, so RF conditions apply if the cable length gets over a couple of hundred metres. Nevertheless, it is good practice for a digital audio cable to be 75 ohm impedance, as that's the sending and terminating impedance.

For analogue audio, impedance is completely irrelevant until cable lengths get into the many hundreds of metres. Originally, professional audio used 600 ohms as a standard impedance; this was taken from telephony, which sent analogue signals over hundreds of kilometres, where impedance really did matter. For domestic audio use, or even professional use these days, cable impedance is irrelevant.
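To put rough numbers on that (my own sketch, not from the post: I'm assuming a typical coax velocity factor of about 0.66 and the common lambda/10 rule of thumb for when a line becomes "electrically long"):

# Rough numbers: at what length does a cable stop being "just a wire"?
# Assumptions (mine, not from the post): velocity factor ~0.66 for typical
# solid-dielectric coax, and the lambda/10 rule of thumb.

C_VACUUM = 3.0e8        # speed of light in a vacuum, m/s
VELOCITY_FACTOR = 0.66  # assumed, typical for solid-dielectric coax

def length_where_impedance_matters(freq_hz):
    """Rough length (m) beyond which transmission-line effects show up."""
    wavelength = C_VACUUM * VELOCITY_FACTOR / freq_hz
    return wavelength / 10

for label, freq in [("20 kHz (top of the audio band)", 20e3),
                    ("2.8 MHz (S/PDIF bit rate at 44.1 kHz)", 2.8e6)]:
    print(f"{label}: matters beyond ~{length_where_impedance_matters(freq):.0f} m")

On those assumed numbers, an analogue run would have to approach a kilometre before the cable's impedance mattered, while a digital lead is into transmission-line territory within a few metres.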

75 ohm unbalanced cables can be used equally for digital or analogue audio, as can 110 ohm balanced cables. In fact, it is sensible for all cables to be either 75 ohm or 110 ohm, as they can then be used for either analogue or digital, over any distance.

S.
 
Yes, you can use a 75 ohm interconnect for normal audio; they are usually well made. Standard phono leads are typically more like 40-50 ohms. Characteristic impedance (determined by the physical construction) is irrelevant at audio frequencies unless you want to use miles of cable, but digital runs at RF (MHz), where it does matter.
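That "physical construction" point can be made concrete: at RF, characteristic impedance comes out to roughly Z0 = sqrt(L/C), with L and C the cable's per-metre inductance and capacitance. A minimal sketch, using ballpark RG-59-style figures (assumed values, not measurements):

import math

# Ballpark per-metre figures for an RG-59-style 75 ohm coax (assumed values):
L_PER_METRE = 380e-9   # series inductance, henries per metre
C_PER_METRE = 67e-12   # shunt capacitance, farads per metre

# Lossless-line approximation, valid at RF where the inductance term dominates:
z0 = math.sqrt(L_PER_METRE / C_PER_METRE)
print(f"Z0 = {z0:.0f} ohms")   # prints ~75

At audio frequencies the conductor resistance swamps the inductance term, so the impedance the cable actually presents bears little relation to the figure printed on the sleeve anyway.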
 
the frequency of the stream is in excess of 1.4MHz, so RF conditions apply if the cable length gets over a couple of hundred metres
The 'bit' rate in a 44100Hz S/PDIF channel is 2.8Mbit/s, so the bandwidth needed to keep the edges nice and square is significantly higher. If you want it to work at 176400 or 192000, multiply accordingly.
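Spelling that arithmetic out (a sketch assuming the standard S/PDIF frame of two 32-bit subframes per sample; the code is mine, not Paul's):

# S/PDIF sends 2 subframes of 32 bits per sample frame = 64 bits per sample,
# and biphase-mark coding gives up to two line transitions per bit.
BITS_PER_FRAME = 64

for fs in (44_100, 176_400, 192_000):
    bit_rate = fs * BITS_PER_FRAME   # bits per second on the wire
    print(f"{fs} Hz: {bit_rate / 1e6:.2f} Mbit/s "
          f"(up to {2 * bit_rate / 1e6:.2f} M transitions/s with biphase-mark)")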

Paul
 
Strangely enough, when I tried using an old QED interconnect I had kicking about as a 'stand-in' digital interconnect, I found real problems with the sound. Moving to a dedicated (but inexpensive) digital cable made a big difference.

It shouldn't have made a difference, but it did. I'm still fairly cable-sceptical these days, though I have heard differences in certain cables, most notably things like the Anticables and the NVA cables. I still reckon that worrying about cables before the room is sorted (which mine isn't, tbh) is like putting an Elastoplast on a broken leg... not that my room issues are that serious, mind you...
 
Thanks to the OP and respondents: I've always wondered about this (well, not always, just the last 10 years, really)! I once picked up a bin-end pair of RCA-to-RCA cables at Maplins for a fiver, gold plugs da-di-da, and noticed that they had '75 ohms Digital' stamped on them: now I know why (they don't sound any better or worse than any other cheap cables I own).
 

