Originally posted by dcshooter: CDs quantifiably have much much higher sound fidelity
When CDs first came out, I believe this was false in one particular respect. Early digital recording equipment had low sampling rates ( AFAIK, about the same as that of the CDs themselves ). Therefore, the analogue anti-aliasing filters used in their front ends ( upstream of the A/D converters ) would introduce massive phase distortion in the audible range. This would apply to new recordings, and to older analogue recordings that were converted to digital format for issue on CDs.
It's been many years since I studied DSP, but I recall that this phase distortion would have to be present - it's all in the math. Could a discriminating listener HEAR this distortion? Hard to say. Phase information is important for how we locate the sources of sound in 3 dimensions. Assuming that a recording is made very carefully in the analogue domain, such that all the pertinent phase information is preserved with minimal distortion, it could theoretically be superior in this respect to the same recording made with an analogue->digital->analogue conversion somewhere in the signal path.
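To put a rough number on the "it's all in the math" claim, here's a small Python sketch. It models the anti-aliasing filter as an idealized cascade of first-order RC lowpass sections with the cutoff squeezed just above the audible band, the way a 44.1 kHz front end forces you to - the 8-pole / 21 kHz figures are my own illustrative assumptions, not the actual filters used ( real designs were steeper, e.g. elliptic, and worse in this respect ). The group delay varies with frequency, which is exactly the phase distortion I'm describing:

```python
import math

def group_delay_us(f_hz, fc_hz, order):
    """Group delay (in microseconds) of `order` cascaded first-order RC
    lowpass sections with cutoff fc_hz, evaluated at f_hz.
    Each section contributes phase -atan(f/fc), so the cascade's group
    delay is tau(f) = order / (2*pi*fc) * 1 / (1 + (f/fc)**2)."""
    r = f_hz / fc_hz
    return order / (2 * math.pi * fc_hz) / (1.0 + r * r) * 1e6

# Hypothetical steep filter just above the audible band, as an early
# 44.1 kHz front end would need: 8 poles, cutoff 21 kHz.
for f in (100, 1_000, 10_000, 20_000):
    print(f"{f:>6} Hz: {group_delay_us(f, 21_000, 8):5.1f} us")
```

The delay at 20 kHz comes out roughly half the delay at 100 Hz ( about 32 µs vs about 61 µs in this toy model ), so different frequency components of a transient arrive smeared in time. Whether ~30 µs of smear is audible is the debatable part, not whether it exists.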
Note that the complaint audiophiles had about CDs is that for some recordings, they lacked the perceived 3D effect found in the corresponding analogue recordings. The argument, as I recall, was that if everything was set up just right, you could pick out exactly where the original performers were located in 3 dimensional space. With the identical recording played off a CD, this audio image was not as clear.
But everything would have to be "just so" in the pure analogue case for this discrepancy to show up as a demonstrable difference in fidelity. That would not likely be the case with your average musical recording.
Nowadays, it should be possible to record digitally at a much higher sampling rate than first generation digital recording equipment allowed. That pushes the Nyquist frequency well above the audible band, so a much gentler anti-aliasing filter will do, with less ( or possibly even no ) phase distortion in the audible range. So in theory, you should be able to avoid this problem with more modern digital recording equipment. Are these faster sampling rates actually used in modern recording studios? I don't know.
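Extending the same toy RC-cascade model from above ( my own illustrative assumption, not any real product's filter ), you can see why the higher sampling rate helps: with Nyquist moved far above 20 kHz, the same order of filter can have its cutoff moved up too, and the audible-band group-delay spread nearly vanishes:

```python
import math

def delay_spread_us(fc_hz, order, f_lo=20.0, f_hi=20_000.0):
    """Worst-case audible-band group-delay spread (microseconds) for
    `order` cascaded first-order RC lowpass sections with cutoff fc_hz.
    Group delay of this filter decreases monotonically with frequency,
    so the spread is simply tau(f_lo) - tau(f_hi)."""
    def tau(f):
        r = f / fc_hz
        return order / (2 * math.pi * fc_hz) / (1.0 + r * r)
    return (tau(f_lo) - tau(f_hi)) * 1e6

# Cutoff forced to 21 kHz (44.1 kHz sampling) vs a relaxed 80 kHz
# cutoff that a ~176 kHz sampling rate would permit.
print(f"fc = 21 kHz: {delay_spread_us(21_000, 8):5.1f} us spread")
print(f"fc = 80 kHz: {delay_spread_us(80_000, 8):5.1f} us spread")
```

In this model the spread drops from roughly 29 µs to under 1 µs - the phase distortion doesn't strictly go to zero, but it becomes far smaller than anything plausibly audible.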
Bottom line - when you consider the digital recording equipment in use when CDs first came out, in theory, you could have made a better fidelity recording with pure analogue recording equipment ( if you believe that phase distortion can have a negative effect on the perceived quality of the recording ).
Many recordings were probably not made carefully enough for this difference to matter, and the average listener doesn't play recordings back on the kind of equipment/setup where you'd be able to tell the difference.
From a pragmatic perspective, for most listeners, it's no contest - CDs are better ( more convenient, more durable, better quality control, less expensive, etc. ). But that does not mean that there is not a kernel of truth in the argument given by the audiophile that an analogue recording can be superior.
Maybe this is a pedantic argument, but I believe there was ( and possibly still is ) a quantifiable discrepancy in recording fidelity in favour of analogue recordings vs CDs. Sure, for most people, it probably wasn't worth pursuing, but that doesn't mean it didn't exist.
Returning to cameras, it may be that under ideal conditions ( bright light, etc. ), the CCD sensor has some subtle advantage over an equivalent CMOS sensor that doesn't show up in test results, but that people can discern. We can only quantify what we decide to measure; if we don't go to the trouble of measuring something, that doesn't mean it isn't quantifiable. Is it possible that a different sensor technology responds differently to subtle shades of colour? I don't know. Does anyone ever test for it?
But CMOS performs better under non-ideal conditions, so for the average user, even if CCD works better under some conditions, it's probably better to take the trade-off and get a camera with a CMOS sensor. It's going to be more versatile. That doesn't mean that someone can't make the argument that under certain conditions, the CCD sensor produces 'better' results - whatever 'better' means to their eyes.
I used a K200D for a long time ( same sensor as the K10 ). I think that it may produce better photos than my K30 under ideal conditions ( ignoring the difference in resolution - and I don't know if there is a resolution limitation inherent in the CCD technology ). But I prefer the brighter viewfinder, dual control dials, high ISO performance, TAv mode, etc. etc. of the K30.
I prefer using my K30, but I acknowledge that under certain circumstances, the K200D may produce subtly better images.
Maybe. It could be my imagination, because I haven't tried to quantify it, or perform rigorous A/B comparisons. I'm not sure how I would go about doing so in any case.
---------- Post added 01-24-2015 at 01:19 PM ----------
Originally posted by normhead: I suppose you have some example images that show us what you're talking about? The strong point for me was the dynamic range of the K-5. At 100 ISO there was a lot more retrievable shadow detail. I didn't have a K10, but I had a K20D. After my wife, who was using a K-x, started using my K-5, we bought a second one, based on side by side shooting. That is, we were both shooting the same sunsets and scenery, and the difference between the CCD and CMOS shadow detail was noticeable. So in terms of Dynamic Range I'm sure the CCD is inferior. For those images, which for some people is a very high percentage, if the whole histogram takes up less than the entire graph - meaning the dynamic range falls within the capability of both sensors - then you can discuss rendition. But I'd still like a situation with both cameras shooting side by side and some comparison photos to try and pick out a difference.
We saw practically no difference in rendition, but a real difference in Dynamic Range.
K20D, K-x, and K5 all have ( different ) CMOS sensors:
https://www.pentaxforums.com/forums/pentax-cameras-compared/?c1=k20d&c2=kx&c3=k5
Only the K-5 has a 14-bit A/D converter.
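Those extra bits matter directly for the dynamic range discussion. The textbook result for an ideal N-bit quantizer with a full-scale sine input is a signal-to-noise ratio of 6.02*N + 1.76 dB ( real sensors are also limited by read noise and photon shot noise, so this is an upper bound, not what any actual camera achieves ):

```python
import math

def ideal_adc_dynamic_range_db(bits):
    """Theoretical SNR of an ideal N-bit quantizer with a full-scale
    sine input: equivalent to the textbook 6.02*N + 1.76 dB formula.
    20*log10(2**N) is the quantization-step term; 10*log10(1.5) is the
    +1.76 dB sine-vs-uniform-noise term."""
    return 20 * math.log10(2 ** bits) + 10 * math.log10(1.5)

print(f"12-bit: {ideal_adc_dynamic_range_db(12):.1f} dB")  # ~74.0 dB
print(f"14-bit: {ideal_adc_dynamic_range_db(14):.1f} dB")  # ~86.0 dB
```

So going from a 12-bit to a 14-bit converter buys about 12 dB ( two stops ) of theoretical headroom, which fits the retrievable-shadow-detail difference normhead describes, whatever the sensor technology in front of the converter.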