Originally posted by Gimbal: It has now been stated by everyone: pixels are not used to measure size. Right, keep that in mind for the rest of this post. Pixels are not ….
So take for example an image projected on a 35mm frame/sensor and then viewed on a 24” screen.
Has the image size been reduced? (Remember, pixels are NOT a measurement of size; millimeters are.)
It has been enlarged.
If we now take an image from an APS-C sensor and display it on the same 24” screen, it has to be enlarged even more than the FF image to fill the screen. ENLARGED, even though it was downsampled. Pixels are NOT used to measure size.
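To put rough numbers on that enlargement, here is a quick sketch. The assumptions are mine, not from the post above: a 16:9 24” screen, a 36 mm-wide full-frame sensor, and a 23.6 mm-wide APS-C sensor (APS-C widths vary by manufacturer), with the image scaled to fill the screen width.

```python
import math

def screen_width_mm(diagonal_in, aspect_w=16, aspect_h=9):
    """Physical width of a screen, computed from its diagonal and aspect ratio."""
    diag_mm = diagonal_in * 25.4
    return diag_mm * aspect_w / math.hypot(aspect_w, aspect_h)

screen = screen_width_mm(24)   # roughly 531 mm wide
ff_mag = screen / 36.0         # full-frame sensor width: 36 mm
apsc_mag = screen / 23.6       # an assumed APS-C width; varies by maker

print(f"FF enlarged ~{ff_mag:.1f}x, APS-C enlarged ~{apsc_mag:.1f}x")
```

Under those assumptions the full-frame image is enlarged roughly 15x and the APS-C image roughly 22x, which is the point being made: the smaller sensor's image must be magnified more to fill the same screen.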
The photodiode that converts the photons to electrons has a physical size, but once the charge has been measured and stored as a digital value, it's just a data point with no physical size. It isn't stored as part of an image of a certain physical size, it's stored as part of an image of a certain pixel resolution.
A line one pixel wide on a sensor does not get enlarged into a line multiple pixels wide on a computer monitor. At 100% it will appear as a line one pixel wide on the monitor, but after downsampling to fit the whole image on the screen it's entirely possible that it will just get downsampled away. That said, downsampling algorithms are designed to try to preserve single-pixel edges to retain a sense of sharpness.
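A minimal sketch of that dilution, using plain block averaging (real scalers such as bilinear or Lanczos behave differently, and many do try to preserve edges; the toy image and sizes here are purely illustrative):

```python
import numpy as np

# 8x8 black image with a single one-pixel-wide white vertical line.
img = np.zeros((8, 8))
img[:, 3] = 1.0

# Naive 4x box-average downsample to 2x2: each output pixel is the
# mean of a 4x4 block, so the line's 4 white pixels are averaged
# with 12 black ones.
small = img.reshape(2, 4, 2, 4).mean(axis=(1, 3))
print(small)  # the line survives only as 25% gray in one column
```

With a coarse enough downsample, a one-pixel feature is averaged toward the background rather than staying a crisp line, which is why edge-preserving resampling exists.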
The image on your computer screen is big because the individual pixels that make up the screen are big. You've now taken the data points that have no physical size and turned them into something that does have a physical size.
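To illustrate that last step, the monitor's pixel pitch is what assigns a physical size to the data points. The panel here is an assumed 24” 1920x1080 display; nothing in the image file itself dictates these numbers.

```python
# Pixel pitch of an assumed 24" 1920x1080 (16:9) monitor.
diagonal_mm = 24 * 25.4
panel_width_mm = diagonal_mm * 16 / (16**2 + 9**2) ** 0.5
pitch_mm = panel_width_mm / 1920          # roughly 0.28 mm per screen pixel

# A hypothetical 6000-pixel-wide image, scaled down to fit the screen:
image_px = 6000
displayed_mm = min(image_px, 1920) * pitch_mm

print(f"pixel pitch ~{pitch_mm:.3f} mm; displayed width ~{displayed_mm:.0f} mm")
```

The same file shown on a phone or a projector would have a completely different physical size, because the size comes from the display's pixels, not from the data.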
A digital camera sensor is a fundamentally different technology to a piece of film, and you have to be careful not to assume that they do things the same way. In many ways, the resemblance between the two technologies is so superficial that it's almost irrelevant.
Last edited by Dartmoor Dave; 08-23-2019 at 05:29 AM.