Originally posted by Parallax That also makes sense.
That's the answer to start from. Smaller pixels merely let you "microscope" the defects without degrading the image as a whole.
Pixel quality and image quality are two different things. Btw, that's a common source of misunderstanding in discussions of depth of field as well.
So the initial assumption (that a higher-resolution sensor may deliver worse images) doesn't hold. It could hold if the fill factor degraded, but that's currently not the case with dSLRs, except maybe in corners below f/1.8.
With this clarified, you may want to have a look at
LumoLabs Article -- Understanding Image Sharpness
It explains how a better sensor may even help a lens which is already "outresolved". Think of lens and sensor each as a filter layer that adds fuzziness. What you see is the combined (added) effect of looking through both layers. A higher-resolving sensor is like a thinner filter layer: it helps (a little bit) even if the other layer is already thicker.
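The two-layer picture can be sketched numerically. A minimal sketch, assuming both "layers" are Gaussian blurs (so their widths add in quadrature); all numbers are illustrative, not from the article:

```python
import math

def combined_blur(lens_blur_um, sensor_blur_um):
    """Combined blur (in um) of two independent Gaussian blur sources.

    Convolving two Gaussians yields a Gaussian whose width is the
    quadrature sum of the component widths.
    """
    return math.sqrt(lens_blur_um**2 + sensor_blur_um**2)

lens = 8.0  # the lens is already "outresolved": its blur dominates

coarse = combined_blur(lens, sensor_blur_um=6.0)  # coarse-pixel sensor
fine = combined_blur(lens, sensor_blur_um=3.0)    # higher-resolution sensor

# The thinner "sensor layer" still reduces the total blur a little,
# even though the lens layer is thicker.
assert fine < coarse
```

With these illustrative numbers the total drops from 10.0 um to about 8.5 um, i.e. the finer sensor helps even though the lens contributes most of the blur.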
---------- Post added 11-22-10 at 07:09 PM ----------
Originally posted by Eruditass A higher resolution sensor will never produce an image inferior to a lower resolution sensor, all else equal.
Here's a graph I made, where the perceived sharpness is the intersection of the 3 curves:
Eruditass, nice graph.
However, the overall perceived sharpness depends on all three factors in a more complex way: the correct treatment is the multiplication of the three MTF curves describing each effect, not their intersection. Of course, that's rather complicated. In my paper (cited above) I try to provide simplifications.
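The multiplication of MTF curves can be sketched like this. The Gaussian MTF shape and the 50%-contrast frequencies are assumptions for illustration only:

```python
import numpy as np

freq = np.linspace(0.0, 100.0, 201)  # spatial frequency, cycles/mm (illustrative)

def gaussian_mtf(f, f50):
    """Gaussian-shaped MTF that falls to 0.5 contrast at frequency f50."""
    return 0.5 ** ((f / f50) ** 2)

# Hypothetical 50%-contrast frequencies for the three effects.
mtf_lens = gaussian_mtf(freq, f50=40.0)
mtf_sensor = gaussian_mtf(freq, f50=60.0)
mtf_diffraction = gaussian_mtf(freq, f50=80.0)

# The system response is the product at each frequency,
# so it lies below every individual curve.
mtf_system = mtf_lens * mtf_sensor * mtf_diffraction
```

Because the product is taken frequency by frequency, the system curve is always lower than the weakest component alone, which is why "intersection of the curves" understates the combined loss.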
Probably the best simplification is to introduce a blur (the inverse of resolution) or fuzziness (the square of the blur, i.e., the surface area of a blurred point).
The statement then simplifies to: the overall fuzziness is the sum of the fuzzinesses due to limited lens quality, finite sensor pixel size, and diffraction. If you replace fuzziness by blur, this is the rule known as the Kodak formula. However, I found that fuzziness allows a better treatment for digital devices, which can sharpen.
Because it is a sum, reducing the finite sensor pixel size will always reduce the overall fuzziness. And the overall fuzziness can never become smaller than the contribution from the finite sensor pixel size.
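The two simplifications above, and both consequences of the sum, can be sketched in a few lines. The blur values are illustrative assumptions, not measurements:

```python
import math

def total_blur_kodak(blurs):
    """Kodak-formula style: total blur is the plain sum of component blurs."""
    return sum(blurs)

def total_blur_fuzziness(blurs):
    """Fuzziness rule: fuzziness = blur**2 adds, so total blur is the
    square root of the sum of squared component blurs."""
    return math.sqrt(sum(b**2 for b in blurs))

# Illustrative component blurs in um: lens, sensor pixel, diffraction.
blurs = {"lens": 6.0, "pixel": 5.0, "diffraction": 3.0}

# Shrinking the pixel term always lowers the total ...
smaller_pixel = dict(blurs, pixel=3.0)
assert total_blur_fuzziness(smaller_pixel.values()) < total_blur_fuzziness(blurs.values())

# ... but the total can never drop below the pixel contribution alone.
assert total_blur_fuzziness(blurs.values()) >= blurs["pixel"]
```

Note how the quadrature sum (about 8.4 um here) is gentler than the plain Kodak sum (14 um): squaring before adding lets a small contribution matter less, which matches the behavior of devices that can sharpen.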
This statement isn't mathematically exact, but, as I tried to show, it is the best possible simplification.