Originally posted by RiceHigh
Nope. Adding up and averaging two or more *noisy* pixels will not give you noiseless pixel data. The physics says that only a larger pixel, with more area and more full-well capacity to receive light and store charge, will have less noise, i.e., a higher S/N ratio.
I think you should think about Eric Fossum's words....
Re: Pixel Density: when Moore is less: News Discussion Forum: Digital Photography Review
I don't disagree with your analysis, but there are two factors that you may not be considering.
First, as pixel sizes shrink, light-gathering technology (microlenses) and quantum efficiency both typically improve. So, while mathematically there may be no improvement in SNR as pixel sizes shrink (including spatial averaging), in practice, due to QE improvements etc., there is an apparent improvement with pixel shrink.
If you care to argue w/ the inventor of the active pixel sensor go ahead.
Also an interesting paper:
http://www.imagesensors.org/Past%20Workshops/2007%20Workshop/2007%20Papers/0...%20et%20al.pdf
All of which stemmed from this thesis presented by Emil J. Martinec and not refuted by Mr. Fossum.
If we look at the numbers for middle grey, where the noise is largely photon noise, we find that the D40x noise figures are uniformly about 30% higher than those of the D40 up through ISO 1600.
So one would be led to believe that the 'terrible tradeoff' is that the D40x gets more resolution than the D40, but at the expense of being a rather noisier camera. Indeed, a casual visitor to DPReview considering the purchase of one of these cameras would face a real dilemma: do they go for the extra resolution, or avoid the extra noise that apparently comes with it?
Of course, this is a false choice. The point is that the noise in these reviews is measured at the pixel level, i.e., at the Nyquist frequency, a spatial frequency which varies with the pixel density, so the figures have no common frame of reference. If one does the math, the relative photon noise scales in inverse proportion to the pixel pitch: the number of photons captured scales with the pixel area; the photon noise is the square root of the photon count, so it scales with the square root of area, i.e., the linear dimension of the sensor real estate; and the noise-to-signal ratio therefore goes as 1/pitch. So with 67% more pixels, the D40x pixel pitch is smaller than the D40's by a factor of √1.67 ≈ 1.29, and there's the explanation of the roughly 30% noise difference at the pixel level.
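The arithmetic above can be checked in a few lines. This is a back-of-envelope sketch; the megapixel counts are rounded figures assumed from the post, not measured values:

```python
import math

# Assumed (rounded) pixel counts from the post: D40 ~6 MP, D40x ~10 MP
mp_d40, mp_d40x = 6.0, 10.0

pixel_ratio = mp_d40x / mp_d40           # ~1.67x more pixels on the same sensor area
pitch_ratio = math.sqrt(pixel_ratio)     # linear pitch shrinks by sqrt(1.67) ~ 1.29

# Photons per pixel scale with area (pitch^2); photon noise is sqrt(photons),
# so the per-pixel noise-to-signal ratio scales as 1/pitch.
noise_increase_pct = (pitch_ratio - 1) * 100
print(f"{pixel_ratio:.2f}x pixels -> {noise_increase_pct:.0f}% more pixel-level noise")
```

Running this gives about a 29% increase, matching the ~30% difference in the review's noise figures.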
But this pixel-level comparison is also entirely misleading. Since noise scales with the linear dimension of the region of sensor considered, what would happen if one resampled the D40x image to 6MP? One would combine the data from 1.67 D40x pixels to make each pixel of the resampled image, and the noise would drop by that same factor of √1.67 ≈ 1.29 at the pixel level of the resampled image (assuming the resampling is done properly). In other words, viewed at the same image size, the D40x and the D40 have the SAME amount of noise. The only difference is that the D40x need not be resampled to deliver this same level of noise; it merely needs to be viewed at equal size. Oh yes, there is one additional difference: the D40x has about 30% higher linear resolution while having that same level of noise.
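The resampling argument can also be simulated directly. A Monte-Carlo sketch, assuming pure Poisson shot noise and made-up photon counts (600 per small D40x-like pixel, 1000 per large D40-like pixel, i.e., a 1.67× area ratio at middle grey):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 600_000  # divisible by both 5 and 3

small = rng.poisson(600, n)    # smaller, denser pixels (assumed 600 photons each)
large = rng.poisson(1000, n)   # larger pixels with 1.67x the area (assumed 1000 photons)

# Per-pixel noise-to-signal ratio: the smaller pixel is noisier by ~sqrt(1.67) ~ 1.29
per_pixel_ratio = (small.std() / small.mean()) / (large.std() / large.mean())
print(per_pixel_ratio)

# Same patch of sensor: 5 small pixels cover the same area as 3 large pixels.
patch_small = small.reshape(-1, 5).sum(axis=1)
patch_large = large.reshape(-1, 3).sum(axis=1)

# Equal area -> equal photon count -> equal relative noise at equal viewing size
print(patch_small.std() / patch_small.mean())   # both ~ 1/sqrt(3000)
print(patch_large.std() / patch_large.mean())
```

The per-pixel ratio comes out near 1.29, while the equal-area patches show essentially identical relative noise, which is the whole point: at the same output size, the two sensors are equally noisy.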
So I can imagine the route by which one arrives at the conclusion that the megapixel race is a bad thing; it is a viewpoint entirely reinforced by this site's testing methodology, which concentrates on pixel-level noise without (say) rescaling it to noise as a percentage of frame height. And it can lead the consumer to make bad choices: not realizing that the D40x and D40 have the same level of image noise, a noise-averse buyer may decide to go for the D40 when there is absolutely no reason to do so.
Apparently physics says one thing, engineering says another...