DevillEars
Dedicated ignorer of fashion
One aspect that needs to be taken into account is the difference in the rate and nature of the separate evolutions of transports, DACs, and their key "bought-in" components.
Taking each individually and starting with transports:
The major bought-in element has, from the start, been the optical mechanism, comprising the spindle, disk support, and speed-control servo circuitry, plus the optical read circuitry with its error detection/correction logic.
The OEMs like Philips, Sony, Sanyo, TEAC, etc. have - between them - driven (and been driven by) the conflicting (and evolving) priorities of quality and price point, which has resulted in finite product life cycles for these bought-in elements.
External (to the audio market) factors, such as the explosive growth of optical disk drives in computers and, more specifically, in PCs, have had their impact, with TEAC switching focus almost completely to computer drives while leaving their Esoteric (audio-only) division to drive the VRDS Neo range of mechanisms - which are prohibitively costly for use in any but the most expensive consumer products (e.g. dCS).
Philips was also not immune to these market forces: their early star optical mechanisms (CDM-4, CDM-9 and CDM-9PRO) were casualties, and subsequent replacement mechanisms were even more short-lived.
Due to the conflicting priorities and different consumption volumes of the computer and audio markets, PLUS the drop-off in pure audio-only Red-Book-compatible mechanisms, it would appear that the computer market is now the primary driver for these OEMs.
As a result, many CD transports and players have wound up using "universal drives" initially targeted at the PC market and tweaked for use in the audio market. The resulting compromises range from extended "disk inserted to disk ready" timings (due to the added logic needed to identify the type of disk), to slot-load mechanisms with below-par disk clamping, to higher levels of jitter (due to mass-market production standards).
In summary, the overall quality of these key components has declined.
Meanwhile, in the DAC world, there lies another key "bought-in" element - this time the DAC chip used.
Early DACs tried to address the issue of jitter by introducing links between transport and DAC that slaved the DAC clock to the clock in the transport.
Over time, DAC manufacturers looked at other answers to the problem of jitter and, after some experiments with interface devices, started to re-clock the digital signal within the DAC in an attempt to improve SQ despite the dropping quality of the digital feed (driven, as above, by the growth of the PC market and its impact on optical mechanism quality).
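As a purely illustrative sketch (not any manufacturer's actual circuit), the principle of re-clocking can be shown in a few lines of Python: samples arrive with timing jitter, sit briefly in a buffer, and are read out on a clean local clock, so the output timing no longer carries the input's jitter. All numbers below are made up for illustration.

```python
import random

# Toy model of re-clocking: jittered arrival times vs. a clean local clock.
# All figures are illustrative; real designs use hardware FIFOs and PLLs.
PERIOD_NS = 1e9 / 44_100          # nominal sample period at 44.1kHz (~22676ns)
random.seed(0)

# Samples arriving from the transport, each up to +/-500ns off its ideal slot
arrivals = [i * PERIOD_NS + random.uniform(-500, 500) for i in range(100)]
# Samples read out of the buffer on the DAC's own fixed local clock
reclocked = [i * PERIOD_NS for i in range(100)]

jitter_in = max(abs(t - i * PERIOD_NS) for i, t in enumerate(arrivals))
jitter_out = max(abs(t - i * PERIOD_NS) for i, t in enumerate(reclocked))
print(f"peak timing error in:  {jitter_in:.0f} ns")
print(f"peak timing error out: {jitter_out:.0f} ns")
```

The re-clocked output is jitter-free by construction; the real engineering lies in keeping the buffer from under- or over-running as transport and DAC clocks drift apart.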
The nett effect on the DAC market was a continuous cycle of improvement to counter the situation in transports.
The third element was the mechanism used to interface the transport to the DAC.
This was, initially, a two-part approach, as separate CDT/DAC links tended to follow either the S/PDIF or AES/EBU electrical interfaces, or the Toslink or AT&T optical interfaces.
Also, early DAC designs were focused on CD only (i.e. 16-bit/44.1kHz) but, as new source media standards came along - DAT at a 48kHz sample rate, DVD-A (typically at 96kHz), then SACD (which added the need for a DSD interface), through to high-res computer digital audio files at up to 24-bit/192kHz and beyond - DAC manufacturers have expanded their units to provide multiple switchable inputs covering each additional input type.
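For context, the raw stereo data rates implied by those formats are easy to work out. The figures below are audio payload only (no framing or error-correction overhead), and the SACD line assumes standard DSD64:

```python
# Rough stereo data rates for the source formats mentioned above
# (payload only - framing and error-correction overhead excluded)
formats = {
    "CD (Red Book)": (16, 44_100),
    "DAT":           (16, 48_000),
    "DVD-A (typ.)":  (24, 96_000),
    "Hi-res files":  (24, 192_000),
}
for name, (bits, rate) in formats.items():
    mbps = 2 * bits * rate / 1e6          # 2 channels
    print(f"{name:>14}: {mbps:.3f} Mbit/s")

# DSD64 (SACD) is a 1-bit stream at 64 x 44.1kHz per channel:
dsd64 = 2 * 1 * 64 * 44_100 / 1e6
print(f"{'SACD (DSD64)':>14}: {dsd64:.3f} Mbit/s")
```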
The S/PDIF interface (a component of the Red Book standard) called for the entire electrical digital signal to be transmitted via a single conductor or pair.
If we ignore optical links, this interface - with the need to carry the digital audio signal (at 16-bit resolution and a 44.1kHz sample rate) PLUS timing data, CIRC data, and other bits and pieces - wound up with a total frame size (and required feed rate) that approached the then limits of data transmission speeds.
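The arithmetic behind that squeeze is straightforward: in S/PDIF each 16-bit sample is padded into a 32-bit subframe (carrying validity, user, channel-status, and parity bits alongside the audio), with one subframe per channel per frame, and the biphase-mark line code that embeds the clock roughly doubles the transition rate again:

```python
# Back-of-envelope S/PDIF line-rate calculation for CD audio
sample_rate = 44_100        # Hz
channels = 2                # left + right, one subframe each
subframe_bits = 32          # 16 audio bits padded with status/parity/etc. bits

bit_rate = sample_rate * channels * subframe_bits   # 2,822,400 bit/s
baud_rate = 2 * bit_rate                            # biphase-mark doubles it
print(f"payload bit rate: {bit_rate:,} bit/s")
print(f"line symbol rate: {baud_rate:,} baud")
```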
As a result, many integrated players ignored the Red Book guidelines for INTERNAL links between their transport and DAC sections and implemented a 4-conductor interface (I2S) in an attempt to avoid having to operate too close to the transmission speed limits per conductor. (Audio Alchemy was one of the pioneers in using I2S in separate CDT/DAC configurations via proprietary interface cables.)
This approach also eased the implementation of over-sampling by providing the necessary bandwidth to cater for the increased data rate implicit in such designs.
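A hypothetical illustration of why splitting the signal helps: I2S puts the bit clock, word clock, and data on separate conductors, so each line runs far slower than a single S/PDIF conductor would, leaving headroom for the higher rates an over-sampling design needs. (The 32-bit slot width and 256x master clock below are common conventions, not figures from the post.)

```python
# Per-line clock rates for an I2S link carrying CD audio (illustrative)
fs = 44_100                 # word-select (LRCK) rate = sample rate
slot_bits = 32              # 16 audio bits in a 32-bit slot (a common choice)
bclk = fs * 2 * slot_bits   # serial bit clock: 2.8224 MHz
mclk = 256 * fs             # a typical master clock: 11.2896 MHz
print(f"LRCK = {fs} Hz, BCLK = {bclk:,} Hz, MCLK = {mclk:,} Hz")

# With over-sampling, the interface simply scales with the ratio:
for osr in (1, 2, 4, 8):
    print(f"{osr}x over-sampling: BCLK = {osr * bclk / 1e6:.4f} MHz")
```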
Other manufacturers have, in recent years, introduced variations on the I2S interface between CDT and DAC - one notable example is PS Audio in their PerfectWave and DirectStream ranges.
Against this background, the importance of the transport in the chain would appear to be diminishing as the DACs and interfaces improve.
The question of whether or not all transports sound the same remains...
Here one needs to consider the permutations of combinations...
a) where both components are bought new and from today's offerings, given the current technology, different transports are less likely to have a significant impact on SQ (assumption)
b) where DAC is current and CDT is not current, switching transports is also less likely to have a significant impact on SQ (another assumption)
c) where DAC is not current (assume from early designs prior to DAC enhancements to counter failings in transport), then logic would imply that using a higher-quality transport from the same vintage as the DAC would offer better SQ than using a modern non-high-end transport (deduction).
After 42 years in IT, I'm still a believer in the principle of GIGO (Garbage In = Garbage Out), and I've yet to see any evidence that proves beyond all doubt that modern DACs do not benefit from improved quality of digital input signals.
Dave