The truth about bit depth in digital

You can listen to the effects of jitter here:

The problem with that testing is, as he says himself, that there are no DACs with jitter as bad as 2 µs. In fact, DAC master clocks are generally better than 100 ps, which is 20,000 times more accurate than the smallest error he's using. That makes the test fairly meaningless, to be honest, as you'll never see the kind of jitter he's trying to convey in any real DAC.
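To put those two numbers side by side, here's a quick back-of-the-envelope sketch in Python. The full-scale 10 kHz sine and the slope bound 2·pi·f·Δt are my own assumptions for a rough worst case in the audio band, not figures from the test itself:

```python
import math

def worst_case_jitter_error_db(freq_hz: float, jitter_s: float) -> float:
    """Worst-case sampling error from a timing error, relative to a full-scale sine.

    For v(t) = sin(2*pi*f*t) the slope is at most 2*pi*f, so a sample taken
    jitter_s too early or too late is off by at most 2*pi*f*jitter_s of full scale.
    """
    error = 2 * math.pi * freq_hz * jitter_s
    return 20 * math.log10(error)

f = 10_000  # assumed full-scale 10 kHz test tone: roughly worst case in the audio band

print(2e-6 / 100e-12)                          # -> 20000.0 (2 us is 20,000x a 100 ps clock spec)
print(worst_case_jitter_error_db(f, 2e-6))     # ~ -18 dBFS: errors well into clearly audible territory
print(worst_case_jitter_error_db(f, 100e-12))  # ~ -104 dBFS: below the ~96 dB range of 16-bit audio
```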
 

It trains you to identify jitter. That makes it useful.
 
Most early (and not so early) digital recordings were converted to analogue for correction and mastering and back again to digital because digital editing was so inferior. I'm not sure how well-known that was at the time.

Yes. Which baked any 'features', like imperfect monotonicity of the 'steps' between digital values, into the source material. And it means *two* clocks with no fixed relationship got to phase modulate that as well.
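For anyone who hasn't met the term, monotonicity just means that raising the digital code by one should never make the analogue output drop. A minimal sketch of what checking that looks like, with made-up measured levels purely for illustration (not from any real converter):

```python
def non_monotonic_codes(levels):
    """Return the input codes where a converter's measured output level
    drops below the level for the previous (lower) code.

    A perfectly monotonic converter returns an empty list: every +1 step
    in the digital code should give an output >= the previous one.
    """
    return [code for code in range(1, len(levels))
            if levels[code] < levels[code - 1]]

# Hypothetical measured output (volts) for codes 0..5 of a crude DAC:
measured = [0.000, 0.010, 0.021, 0.019, 0.040, 0.051]  # code 3 dips below code 2
print(non_monotonic_codes(measured))  # -> [3]
```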
 
Just as in the '50s, though, when they could make wonderful recordings direct to tape, I suppose there were good (maybe great) sounding, simple, 'one-step' digital recordings of the same sort, where no mixing was required.
 



