Mike P
Trade: Pickwell Audio
I'm afraid I've covered this topic in the past but I'm going to drag it up again....
I've just repaired an old Denon DCD-3300, which is a lovely TOTL vintage player. It uses the Burr-Brown PCM56 (one per channel) and has the facility to adjust the MSB error via a trimmer.
I understand the basic principle would be to play a sine wave test tone from a CD whilst observing the output on a distortion analyser or spectrum analyser, and then adjust the trimmer for lowest distortion (distortion meter) or lowest harmonic 'spikes', especially 3rd and 5th order harmonics (S.A.).
From reading around it seems most people use a 1kHz frequency. I don't know if there's any particular significance to this; I suppose it's in the critical audio spectrum and happens to be a convenient round number.
Again, reading around it seems most people use a very low level signal for this, typically -60dB. However, the service manuals for the Sony CDP-X7esd and CDP-557esd, which have similar adjustment facilities, state 1kHz @ 0dB. The Sony service manual calls for a distortion analyser, not a spectrum analyser. I attempted the Sony procedure a few years ago and bought an old H.P. distortion analyser from Les at Avondale especially for the job, but in the end it became apparent that the distortion analyser I'd bought wasn't up to the job and I abandoned the project.
I now have an old analogue CRT Iwatsu spectrum analyser and the Denon DCD-3300 has rekindled my interest in having a go at this again.
Unfortunately I no longer have any facility to burn a CD-R with an appropriate test tone, so the first thing I need to do is find someone who'd be willing to do this for me. I think what I need is a long track (say 10 mins at least) of a -60dB 1kHz sine wave test tone.
I'm unsure whether or not this sine wave test tone needs to be dithered. Any advice on this would be greatly appreciated.
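For anyone who wants to generate such a tone themselves rather than hunt for a test disc, here's a rough sketch of how it could be done in Python using only the standard library. It writes a 16-bit, 44.1kHz stereo WAV of a 1kHz sine at -60dBFS with simple TPDF (triangular) dither added before quantisation. The filename, track length, and dither amplitude (±1 LSB peak) are my own assumptions, not from any service manual, so treat it as a starting point only:

```python
import math, random, struct, wave

SAMPLE_RATE = 44100      # CD-DA sample rate
FREQ = 1000.0            # 1kHz test tone
LEVEL_DB = -60.0         # target level in dBFS
SECONDS = 10             # increase to 600 for a full 10-minute track
FULL_SCALE = 32767       # 16-bit positive full scale

# -60dBFS works out to roughly 32.8 counts peak, so the tone only
# exercises the bottom few bits -- which is why dither matters here.
amplitude = FULL_SCALE * (10 ** (LEVEL_DB / 20.0))

def sample(n):
    """Return one dithered, quantised 16-bit sample at index n."""
    s = amplitude * math.sin(2 * math.pi * FREQ * n / SAMPLE_RATE)
    # TPDF dither: difference of two uniform randoms, +/-1 LSB peak
    dither = random.random() - random.random()
    return max(-32768, min(32767, int(round(s + dither))))

with wave.open("tone_1khz_-60db.wav", "wb") as w:
    w.setnchannels(2)
    w.setsampwidth(2)          # 2 bytes = 16-bit
    w.setframerate(SAMPLE_RATE)
    frames = bytearray()
    for n in range(SAMPLE_RATE * SECONDS):
        v = sample(n)
        frames += struct.pack("<hh", v, v)  # same tone on both channels
    w.writeframes(bytes(frames))
```

The resulting WAV could then be burned to CD-R as an audio track by whoever has the burning facility. I believe the usual advice is that a low-level tone like this should be dithered, since an undithered -60dB sine at 16 bits is badly quantised and its distortion residue would swamp the MSB error you're trying to trim out, but I'd welcome correction on that.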