Originally posted by 24X36NOW The relationship is higher pixel density = more noise.
"More noise" where?
If you look at a single pixel, yes, it will be noisier.
If you compare the images at 100%, yes, the higher pixel density image will look noisier.
However, that's not a fair comparison, is it?
Or do you think that my K100D yields sharper images than a K-7? By a 100% view comparison it would, because the K100D's 6MP image looks a lot sharper at 100% than the K-7's 14.7MP image.
A 6MP image will always have more contrast between adjacent pixels than a 14.7MP image (everything else being equal and excluding extreme test patterns). That doesn't make the 6MP sensor the more acute one.
I hope the resolution example makes it clear that comparing images with different resolution (size) at 100% doesn't make sense.
A fair image comparison is performed on images of the same size. That means either downsampling the larger image or upsampling the smaller one (or a combination, such as downsampling both to different degrees; printing both at the same size is a practical equivalent).
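To make that concrete, here is a minimal sketch using Pillow (the file names and the target size are my own assumptions, not anything from this thread): downsample the higher resolution image to the smaller image's dimensions before judging noise or sharpness.

```python
# Minimal sketch of a fair comparison: bring both images to the same
# size before judging noise or sharpness. File names are hypothetical.
from PIL import Image

COMMON_SIZE = (3008, 2008)  # e.g. the 6MP image's dimensions

img_6mp = Image.open("k100d_6mp.jpg")    # hypothetical 6MP file
img_15mp = Image.open("k7_14_7mp.jpg")   # hypothetical 14.7MP file

# Downsample the larger image (Lanczos is a good general-purpose filter).
img_15mp_small = img_15mp.resize(COMMON_SIZE, Image.LANCZOS)

# Now img_6mp and img_15mp_small can be compared pixel for pixel.
```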
Originally posted by 24X36NOW In other words, the only way to reduce the noise is to lower the resolution to that of the lower resolution camera,
Note that downsampling a higher resolution image will not only reduce noise but also retain more detail of the scene, compared to the lower resolution image.
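You can see the noise half of that claim in a toy simulation (entirely my own illustration, nothing measured from these cameras): averaging pixel blocks during downsampling reduces per-pixel noise roughly by the square root of the block area.

```python
# Toy simulation: a 2x downsample by 2x2 block-averaging roughly halves
# Gaussian per-pixel noise (sigma / sqrt(4)).
import numpy as np

rng = np.random.default_rng(0)
sigma = 10.0  # per-pixel noise of the "high resolution" image
high_res = rng.normal(loc=128.0, scale=sigma, size=(2000, 2000))

# Average non-overlapping 2x2 blocks.
low_res = high_res.reshape(1000, 2, 1000, 2).mean(axis=(1, 3))

print(f"noise before: {high_res.std():.2f}")  # ~10.0
print(f"noise after:  {low_res.std():.2f}")   # ~5.0
```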
Originally posted by 24X36NOW In other words, there's no such thing as free lunch.
I guess we weren't talking about "lunch" then.
Your noise argument only applies when you compare images at their largest possible output size (for a given dpi limit). A higher pixel density image can be printed larger, and if you enlarge it until its pixels become as big as those of the lower resolution image then, yes, you'll see more noise. But that is comparing apples with oranges.
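For concreteness, a rough print-size calculation (the pixel dimensions are the nominal ones for these sensors; treat them as approximate): at a fixed dpi the higher resolution image simply prints larger, which is the extra headroom a same-pixel-size comparison throws away.

```python
# Rough print-size arithmetic at a fixed dpi (approximate dimensions).
def print_size_inches(width_px: int, height_px: int, dpi: int = 300):
    return width_px / dpi, height_px / dpi

print(print_size_inches(3008, 2008))   # ~6MP    -> about 10.0 x 6.7 in
print(print_size_inches(4672, 3104))   # ~14.7MP -> about 15.6 x 10.3 in
```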