I've read that the difference in colour rendition between CCD and CMOS sensors is just a myth. Apparently, the visible difference comes down to a change in the colour filters, which happened to coincide with the emergence of CMOS sensors. Basically, the red, green, and blue filters used to be more spectrally separate (less overlap between the wavelength ranges they pass), which made the primary colours appear very saturated straight out of the raw image. With CMOS sensors, in an attempt to better approximate the response of our own cone cells, and perhaps to improve low-light performance, colour filter arrays were updated so that each filter covers a wider range of wavelengths that overlaps with its neighbours. The effect is that raw images appear less saturated, though arguably more natural and in line with our own vision.
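To make the idea concrete, here's a minimal toy sketch in Python of how filter overlap alone changes raw saturation. The Gaussian transmission curves, centre wavelengths, and widths are all made-up numbers for illustration, not real CFA data, and the photodiode is assumed to have a flat quantum efficiency:

import numpy as np

wl = np.arange(400, 701)  # visible wavelengths, nm, 1 nm steps

def gaussian(centre, width):
    # Hypothetical bell-shaped transmission/emission curve over wl
    return np.exp(-0.5 * ((wl - centre) / width) ** 2)

# Made-up filter centres; only the widths differ between the two CFAs
centres = {'R': 600, 'G': 540, 'B': 460}
narrow_cfa = {c: gaussian(mu, 15) for c, mu in centres.items()}  # little overlap
wide_cfa = {c: gaussian(mu, 45) for c, mu in centres.items()}    # heavy overlap

# A pure-red stimulus: narrow-band light around 620 nm
stimulus = gaussian(620, 5)

def raw_rgb(cfa):
    # Photodiode treated as wavelength-agnostic (flat QE, an assumption),
    # so each raw channel is just the stimulus integrated through its filter
    return {c: float((stimulus * t).sum()) for c, t in cfa.items()}

for name, cfa in [('narrow filters', narrow_cfa), ('overlapping filters', wide_cfa)]:
    rgb = raw_rgb(cfa)
    r = rgb['R']
    print(f"{name:20s}", {c: round(v / r, 3) for c, v in rgb.items()})

With the overlapping filters, the green channel picks up a meaningful fraction of the pure-red stimulus (roughly a quarter of the red channel's signal with these made-up curves), so the pixel already reads as a less saturated red before any processing. Swap in the narrow filters and nothing about the photodiode changes, only the channel ratios do.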
I don't believe there's any inherent difference between CCD and CMOS tech that could cause a difference in colour rendition: the fundamental difference between the two is only in how the captured electron charge is read out, not in the photosensitive element itself. In theory, placing a modern colour filter array over a CCD sensor should give a rendition equivalent to the same array over a CMOS sensor.
Edit: Maybe there have also been advancements in photodiode technology that could explain rendition differences, but I doubt it, since photodiodes are effectively monochrome and unbiased in what they capture. Whatever photosensitive element is used, it only records the light that is allowed to reach it, which means the only ways to change an image are to transform the light before it's captured or to process the captured image afterward. The sensor itself doesn't get a say.