What does bit-perfect mean?

^This! And then this nonsense gets spread around forums and becomes Truth. No wonder so many audiophiles these days are hopelessly lost in buying expensive cables and other assorted foo.
It was making a different suggestion - that the computation used to decompress the audio stream was somehow audible, i.e. that it affected the ability of the computer to ship the decoded frames to the DAC. This is also untrue. Computers don't work like that; it's a misunderstanding of what is going on.
 
It was making a different suggestion - that the computation used to decompress the audio stream was somehow audible, i.e. that it affected the ability of the computer to ship the decoded frames to the DAC.

That's not what is being suggested.
Some people claim that DSP generates more noise than idle, and that the noise amplitude increases with load - and also that it's not just a raised noise floor but periodic spikes at certain frequencies, a bit like the USB 8 kHz packet noise.
The data stream will arrive at the DAC unaffected, but the ground-plane noise will affect the clock and the D/A conversion.
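If that ground-plane noise exists, it's measurable. A sketch (numpy/scipy assumed; the capture file name is hypothetical) that looks for narrow spurs above the noise floor of an analogue capture of the DAC output, e.g. near the USB 8 kHz packet rate:

```python
# Sketch: estimate the noise floor of a capture of the DAC's analogue
# output and flag narrow spurs standing well above it.
import numpy as np
from scipy.io import wavfile
from scipy.signal import welch

rate, x = wavfile.read("dac_capture.wav")   # hypothetical mono capture
x = x.astype(np.float64)
peak = np.max(np.abs(x))
if peak:
    x /= peak                               # normalise to full scale

# Welch averaging smooths the floor; genuine periodic spurs stay narrow.
freqs, psd = welch(x, fs=rate, nperseg=1 << 16)
psd_db = 10 * np.log10(psd + 1e-30)
floor_db = np.median(psd_db)
for f, p in zip(freqs, psd_db):
    if p > floor_db + 20:                   # anything >20 dB above the floor
        print(f"possible spur at {f:.0f} Hz ({p - floor_db:.0f} dB above floor)")
```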
 
It was making a different suggestion - that the computation used to decompress the audio stream was somehow audible, i.e. that it affected the ability of the computer to ship the decoded frames to the DAC. This is also untrue. Computers don't work like that; it's a misunderstanding of what is going on.
Well, exactly.

Whether you're using a streamer, "network bridge", a PC, or whatever, you're using a computer that's running a general-purpose operating system (Linux in almost every case for standalone devices). This general-purpose OS will be doing many tasks in parallel, in addition to throwing bits out at your DAC, so to claim that the small amount of computation involved in decoding FLAC results in some audible effect is just nonsensical.

If we're going to make things up, how about this:

You shouldn't use WAV files because the long strings of 0s and 1s in the uncompressed data mean that the network interface's clock recovery gets stressed, which results in "a layer of digital hash".
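The reason this one stays made-up: network PHYs scramble and line-code the bitstream precisely so that clock recovery never sees raw payload run lengths. A toy sketch in Python (the LFSR polynomial is illustrative, not any particular standard's):

```python
# Toy scrambler: payload run lengths never reach the wire. The LFSR
# polynomial here is illustrative, not any real PHY's.
def scramble(bits, width=11, taps=(11, 2)):
    state = (1 << width) - 1                    # any non-zero seed
    out = []
    for b in bits:
        fb = ((state >> (taps[0] - 1)) ^ (state >> (taps[1] - 1))) & 1
        out.append(b ^ fb)
        state = ((state << 1) | fb) & ((1 << width) - 1)
    return out

def longest_run(bits):
    best = run = 0
    prev = None
    for b in bits:
        run = run + 1 if b == prev else 1
        best = max(best, run)
        prev = b
    return best

payload = [0] * 512                             # the "stressful" WAV payload
print(longest_run(payload))                     # 512
print(longest_run(scramble(payload)))           # short, pseudo-random runs
```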
 
That's not what is being suggested.
Some people claim that DSP generates more noise than idle, and that the noise amplitude increases with load - and also that it's not just a raised noise floor but periodic spikes at certain frequencies, a bit like the USB 8 kHz packet noise.
The data stream will arrive at the DAC unaffected, but the ground-plane noise will affect the clock and the D/A conversion.

I note the use of "claim", and also the use of "will" where "might" belongs - i.e. the assumption that *any* 'change', no matter how tiny and how far beyond the audio range, might be audible.

Many claims/ideas 'might' be true. Not all of them are. And at some point the added 'noise' might be well below the random hiss in your ears due to the quantisation of 'air'. For that I tend to have a cup of tea and enjoy the music. However, I don't use a Victorian photographer's head clamp to hold my head rigidly in place, so I may suffer from my head moving about all the time as it is battered by air molecules, adding more noise. 8-]
 
I note the use of "claim", and also the use of "will" where "might" belongs - i.e. the assumption that *any* 'change', no matter how tiny and how far beyond the audio range, might be audible.

This noise is definitely not audible because it's way above the audible range.
Its effects on the clock and/or the D/A conversion may be audible, though.
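For scale, a sketch of that mechanism (numpy assumed; the values are illustrative): clock jitter phase-modulates the signal, so modulation at something like the USB packet rate lands in-band as sidebands.

```python
# Sketch: a 10 kHz tone sampled with 1 ns of sinusoidal clock jitter at
# an 8 kHz rate (cf. the USB packet rate) grows sidebands at 2 kHz and
# 18 kHz - in the audio band - at roughly -90 dBc for these values.
import numpy as np

fs, n = 192000, 1 << 18
f_sig, f_jit, t_jit = 10e3, 8e3, 1e-9        # tone, jitter rate, jitter size

t = np.arange(n) / fs
x = np.sin(2 * np.pi * f_sig * (t + t_jit * np.sin(2 * np.pi * f_jit * t)))

spec = np.abs(np.fft.rfft(x * np.hanning(n)))
spec_db = 20 * np.log10(spec / spec.max())
freqs = np.fft.rfftfreq(n, 1 / fs)
for f in (2e3, 18e3):
    k = np.argmin(np.abs(freqs - f))
    print(f"{f / 1e3:.0f} kHz sideband: {spec_db[k]:.0f} dBc")
```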

I also tend to enjoy the music mostly because I hardly ever play something I don't like.
 
This is utter nonsense. Lossless compression is exactly, mathematically, that: lossless. The idea that decompressing (say) FLAC causes "noise" is just fantasy.
I have listened at length to files ripped to WAV, FLAC and AIFF, and if you want to hear the best audio performance then you need to use WAV. I have also asked a number of the top audio companies (and their designers) making the best music servers and they all say, use WAV if you want to achieve the best audio performance. Noise from a number of sources is a big problem in digital audio. The more you can reduce this noise, or reduce its effects, the better the audio performance you will get. A number of companies are separating out the computer functions, so you have a computer optimised for the server function and another computer optimised for the player function. Some companies have even separated these two functions into separate cases, each with their own power supplies and heavy shielding. This is only necessary if you are trying to get the ultimate performance from your stored music. Yes, mathematically lossless compression is supposed to be "lossless", but audibly the lossless files are not as good sonically as the WAV files. I have WAV files and lossless files (same album) and even in my cars you can easily tell the WAV files give the superior audio performance.
 
I have listened at length to files ripped to WAV, FLAC and AIFF, and if you want to hear the best audio performance then you need to use WAV. I have also asked a number of the top audio companies (and their designers) making the best music servers and they all say, use WAV if you want to achieve the best audio performance. Noise from a number of sources is a big problem in digital audio. The more you can reduce this noise, or reduce its effects, the better the audio performance you will get. A number of companies are separating out the computer functions, so you have a computer optimised for the server function and another computer optimised for the player function. Some companies have even separated these two functions into separate cases, each with their own power supplies and heavy shielding. This is only necessary if you are trying to get the ultimate performance from your stored music. Yes, mathematically lossless compression is supposed to be "lossless", but audibly the lossless files are not as good sonically as the WAV files. I have WAV files and lossless files (same album) and even in my cars you can easily tell the WAV files give the superior audio performance.
Those would be the "top audio companies and designers" who make their living by inventing myths like this and selling needlessly expensive and overcomplicated kit to people who want to believe they are buying something special.

I mean honestly it's beyond laughable, but I'll stop there.
 
Bit-perfect is the ability of the playback application to pass the bitstream to the DAC without further alteration. This used to be a big issue with older Windows versions, as every application tended to go through kmixer, which ran a low-quality re-sampler on everything and noticeably degraded the sound. Windows Vista revamped the entire audio subsystem, fixing the audible degradation. Now, unless you need DSD over PCM or MQA support, I wouldn't worry too much, as most systems have moved to higher precision for their internal mixing.
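To make "noticeably degraded" concrete: even a good polyphase resample isn't bit-transparent, and kmixer's was far from good. Since 48000/44100 reduces to 160/147, a round trip is easy to sketch (numpy/scipy assumed):

```python
# Sketch: resample 44.1 kHz material to 48 kHz and back (the kind of
# thing kmixer forced on every stream) and show it isn't bit-transparent.
import numpy as np
from scipy.signal import resample_poly

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 44100)            # 1 s of full-band 44.1 kHz "audio"

y = resample_poly(x, 160, 147)           # 44.1 kHz -> 48 kHz (48000/44100 = 160/147)
z = resample_poly(y, 147, 160)           # and back again
print(len(x) == len(z))                  # True: same length...
print(np.max(np.abs(x - z)))             # ...but clearly non-zero difference
```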

Yes, mathematically lossless compression is supposed to be "lossless", but audibly the lossless files are not as good sonically as the WAV files. I have WAV files and lossless files (same album) and even in my cars you can easily tell the WAV files give the superior audio performance.
This is absolute garbage. It's easy to compare the data and confirm that the FLAC decodes to exactly the same samples. Since WAV is also significantly larger, one could argue that the extra data that needs to be moved off the disk would make more noise than just decompressing the FLAC.
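For instance, a minimal sketch (file names hypothetical; assumes the soundfile library and that the FLAC was encoded from that same WAV):

```python
# Sketch: decode both files and compare sample-for-sample.
import numpy as np
import soundfile as sf

wav, wav_rate = sf.read("album_track.wav", dtype="int32")
flc, flc_rate = sf.read("album_track.flac", dtype="int32")

print(wav_rate == flc_rate)              # same sample rate
print(np.array_equal(wav, flc))          # True: bit-identical samples
```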
 
Those would be the "top audio companies and designers" who make their living by inventing myths like this and selling needlessly expensive and overcomplicated kit to people who want to believe they are buying something special.

I mean honestly it's beyond laughable, but I'll stop there.
Yes, I'll stop here too. I won't waste any more of my time contributing here.
 
I've also listened to the same source material in various formats and find that your absolute assertion returns FALSE in my experience. Nor have test captures shown any sign of an audible effect.
Same here. When I started ripping my CDs I carefully compared WAV, FLAC and ALAC and couldn't detect a difference. I did consider a Naim streamer, as I liked the software for classical music, but their insistence that I would need to re-rip FLAC files to WAV put me off the idea. Why would anyone choose a format with inferior tagging that requires larger drives when there is no advantage in sound quality?

The idea that one can hear differences in sound quality between WAV and lossless in a car seems bizarre to me. Mind you, to be fair, I don't own a Bentley or RR or the like - I couldn't fit my electrostatic speakers in my car!
 
This noise is definitely not audible because it's way above the audible range.
Its effects on the clock and/or the D/A conversion may be audible, though.

That risks conflating two different things.

'DSP' may add 'noise' to the values of the series in the data stream. However, this defined-sequence noise is simply part of the series of samples to be rendered. It isn't noise arriving via some other route that 'bothers' the conversion process or its clock.
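That distinction is easy to make concrete (a sketch, numpy assumed): a volume step with TPDF dither raises the noise floor in the sample values themselves, identically on every run - nothing is arriving at the DAC's clock by a side route.

```python
# Sketch: a -20 dB volume step with TPDF dither. The raised noise floor
# is in the sample values; with a fixed seed it is byte-identical on
# every run. Nothing here touches a clock.
import numpy as np

rng = np.random.default_rng(42)                      # fixed seed: deterministic
n = 1 << 16
t = np.arange(n) / 44100.0
x = np.round(16383 * np.sin(2 * np.pi * 1000 * t))   # 16-bit-scale 1 kHz tone

gain = 10 ** (-20 / 20)                              # the DSP volume step
tpdf = rng.uniform(-0.5, 0.5, n) + rng.uniform(-0.5, 0.5, n)
y = np.round(x * gain + tpdf)                        # dithered requantisation

err = y - x * gain                                   # the added "DSP noise"
print(np.std(err))                                   # ~0.5 LSB, same every run
```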
 
Bit-perfect is the ability of the playback application to pass the bitstream to the DAC without further alteration. This used to be a big issue with older Windows versions, as every application tended to go through kmixer, which ran a low-quality re-sampler on everything and noticeably degraded the sound.

The early systems used for digital audio streaming did tend to have quite big hidden "whoops!". One I recall affected the way the BBC used to stream because the chosen way users had to decode it forced the result to output 44k1 rate - when the BBC were sending 48k! They only noticed that when I found out from tests and told them! As a result, for some years they used to send me test files, etc., and I had a dialogue with them chasing other potential hidden snags. Sadly, they've now largely fired all their audio engineers. This shows up in obvious ways, like the sound levels varying wildly from programme to programme, because the BBC just play out what the independent producers send in. (R3 is largely the exception here, as they still take care - but they level-compress more during 'daytime'.)
 
The early systems used for digital audio streaming did tend to have quite big hidden "whoops!". One I recall affected the way the BBC used to stream because the chosen way users had to decode it forced the result to output 44k1 rate - when the BBC were sending 48k! ...
This, I regret, still happens when I (rather rarely) use BBC Sounds on my iPhone and send the BBC's 48k stream wirelessly to my system using AirPlay. The DAC displays 44.1k. I assume the iPhone is doing a non-integer re-sample; how well it's done I don't know.

AIUI AirPlay has the technical capability to transfer 48k audio, but I think there is some rather opaque policy involved.
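The resample itself is a single polyphase job, since 44100/48000 reduces to 147/160. A sketch (scipy assumed, purely as an illustration - nobody outside Apple knows what they actually use):

```python
# Sketch: the 48 kHz -> 44.1 kHz job as one polyphase filter
# (44100/48000 = 147/160).
import numpy as np
from scipy.signal import resample_poly

x = np.sin(2 * np.pi * 1000 * np.arange(48000) / 48000)  # 1 s of 1 kHz @ 48 kHz
y = resample_poly(x, 147, 160)                           # now at 44.1 kHz
print(len(y))                                            # 44100
```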
 
^This! And then this nonsense gets spread around forums and becomes Truth. No wonder so many audiophiles these days are hopelessly lost in buying expensive cables and other assorted foo.

I can't say I've ever heard the difference between WAV, FLAC etc., but in blind listening tests the difference in cables is pretty easy to hear, and once you start to understand how electrical signals are 'passed' down the wire (as in they are not passed at all, there is no 'flow' of electrons, it's more like a 'propagation'), it's easy to understand why the construction of a cable would make such a huge difference.
 
I can't say I've ever heard the difference between WAV, FLAC etc., but in blind listening tests the difference in cables is pretty easy to hear, and once you start to understand how electrical signals are 'passed' down the wire (as in they are not passed at all, there is no 'flow' of electrons, it's more like a 'propagation'), it's easy to understand why the construction of a cable would make such a huge difference.
Agree to disagree on that :)
 
I think, although I have little actual knowledge, that AirPlay 1 is fixed at 16/44. AirPlay 2 kinda 'does what it wants' but will use 320 kbps AAC at 48 kHz.

Regarding whether it's AirPlay 1 or 2: as far as I know, if anything in the replay chain is AirPlay 1 it'll default to that; if it's all AirPlay 2 compatible it'll run that (and I don't think there's any way of selecting).

The idea of AirPlay 2 was to reduce dropouts and buffering (again, insofar as I've read online).

Do check that, if interested.
 
I can't say I've ever heard the difference between WAV, FLAC etc., but in blind listening tests the difference in cables is pretty easy to hear, and once you start to understand how electrical signals are 'passed' down the wire (as in they are not passed at all, there is no 'flow' of electrons, it's more like a 'propagation'), it's easy to understand why the construction of a cable would make such a huge difference.

The above wrt propagation is something I taught to countless undergrads over the decades. But that doesn't mean it isn't also 'easy' to ensure that any such difference is trivially small in domestic audio cables - i.e. smaller than the effect of moving your head a millimetre.
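To put rough numbers on 'trivially small' (a sketch with assumed, representative values, not measurements): the RC corner formed by a line output's source impedance and an interconnect's capacitance sits orders of magnitude above the audio band.

```python
# Sketch with assumed, representative values (not measurements).
import math

R_source = 100.0      # ohms: typical-ish line output impedance (assumed)
C_per_m = 100e-12     # F/m: typical-ish interconnect capacitance (assumed)
length_m = 2.0

f_corner = 1 / (2 * math.pi * R_source * C_per_m * length_m)
print(f"{f_corner / 1e6:.1f} MHz")   # ~8.0 MHz vs a 0.02 MHz audio band
```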

That said, some amps have been sold which were poorly made in this respect, and would burst into ultrasonic oscillation if you used the 'wrong' cable and speakers - despite engineers knowing how to prevent this by good design.

But all that tells us is that we need to get decently designed and made kit, etc. Not really quantum mech II.

OK, addendum to that: some engineers, etc., may well decide to choose cables, or make amps, that *deliberately* change the results audibly, and argue that it sounds 'better'. e.g. high-inductance and/or high-resistance speaker cable can easily be arranged to create audible changes with various loudspeakers. Or in some cases it's maybe down to dimwittery by the engineer/salesman, I guess.
 
Agree to disagree on that :)
I can respect that, but I am genuinely interested to understand your position on this. As far as I can tell, there are three possible positions here:

1. All cables sound identical, irrespective of the cost and means of their construction; so a £2,000 interconnect will have no discernible difference in sound, good or bad, versus say a £20 one. They will sound completely the same.
2. There are differences between cables, but none of those differences are either objectively better or worse (again irrespective of the cost and means of construction), they are just different.
3. Cables can sound better or worse in a given system, but there is little, if any, correlation with how much those cables cost.

It's a sincere question.
 
That risks conflating two different things.

'DSP' may add 'noise' to the values of the series in the data stream. However, this defined-sequence noise is simply part of the series of samples to be rendered. It isn't noise arriving via some other route that 'bothers' the conversion process or its clock.
No, this is ground-plane noise travelling over copper; there's no change to the data.
You are very knowledgeable, but you should investigate this a bit further before dismissing it.
 

