Originally posted by Fogel70 The amount of light that images are made of is: illuminance x sensor area x shutter speed.
Yep, that's the crux of the problem right there. So let's amend it to state how a digital camera actually works:
The amount of light that images are made of is: illuminance per pixel x shutter speed.
On each pixel, a photodiode converts photons into electrons. On a CMOS sensor, per-pixel readout electronics then measure the accumulated charge, and an analog-to-digital converter turns that measurement into a digital value. A bigger photodetector surface area per pixel converts more photons into electrons, so the charge we actually want to measure is easier to distinguish from the sensor's inevitable background noise, and the sensor's dynamic range is higher.
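To make that concrete, here's a rough back-of-the-envelope model in Python. The quantum efficiency and read-noise figures are made-up placeholders for illustration, not taken from any real sensor's datasheet; the point is just the shape of the relationship between photons collected per pixel and per-pixel SNR.

```python
import math

def pixel_snr(photons, qe=0.5, read_noise=3.0):
    """Simplified per-pixel SNR model.
    Signal = qe * photons electrons; photon shot noise = sqrt(signal);
    read noise (in electrons) adds in quadrature. Numbers are illustrative."""
    signal = qe * photons                          # electrons collected
    noise = math.sqrt(signal + read_noise ** 2)    # shot noise + read noise
    return signal / noise

# A pixel with 4x the area collects 4x the photons at the same illuminance,
# so in the shot-noise-limited regime its SNR roughly doubles:
small = pixel_snr(1_000)
large = pixel_snr(4_000)
```

The square-root behaviour is why doubling pixel area doesn't double SNR: the shot noise grows with the signal, just more slowly.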
Now there are two ways to get that bigger surface area per photodetector. You can have a smaller sensor with fewer megapixels but a bigger surface area per pixel, or you can have a bigger sensor with more megapixels and also a bigger surface area per pixel. But whatever balance you decide to strike between megapixels and individual photodetector surface area, the signal-to-noise ratio is determined at the pixel level. The only thing that's affected by the amount of light hitting the overall surface area of the sensor is exposure, and all but the most lunatic fringe of equivalentists accept that exposure is the same on different sensor formats.
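The trade-off is simple arithmetic. Here's a quick sketch using nominal sensor dimensions (36.0 × 24.0 mm for full frame; 23.5 × 15.6 mm is one common APS-C size, though it varies by maker) and ignoring the gaps and circuitry between photosites:

```python
def pixel_area_um2(sensor_w_mm, sensor_h_mm, megapixels):
    """Approximate photosite area in square microns, ignoring inter-pixel gaps.
    Sensor dimensions here are nominal, not any specific camera's."""
    sensor_area_um2 = (sensor_w_mm * 1000) * (sensor_h_mm * 1000)
    return sensor_area_um2 / (megapixels * 1_000_000)

apsc_24mp = pixel_area_um2(23.5, 15.6, 24)   # ~15.3 square microns per pixel
ff_24mp   = pixel_area_um2(36.0, 24.0, 24)   # 36.0 square microns per pixel
ff_61mp   = pixel_area_um2(36.0, 24.0, 61)   # ~14.2 -- close to the APS-C pixel
```

Note the last line: a high-megapixel full-frame sensor can end up with roughly the same per-pixel area as a modest-megapixel APS-C sensor.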
So can we please be clear about this once and for all: a lens that collects more light has no direct impact on signal-to-noise ratio. A lens with a wider maximum aperture means that you can get away with using lower ISO in lower light, and indeed that does have an impact on the signal-to-noise ratio. But only because of the lower ISO, which means less amplification at the pixel level.
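You can see the amplification point in a toy model. This deliberately ignores downstream electronics noise added after the gain stage (which is where analog gain genuinely helps on real sensors); under that simplifying assumption, gain multiplies signal and noise alike and cancels out of the ratio, while collecting more photons does not:

```python
import math

def snr_after_gain(photons, gain, qe=0.5, read_noise=3.0):
    """Crude ISO model: analog gain applied after charge collection.
    Gain scales signal and pixel noise equally, so it cannot improve SNR;
    only collecting more photons can. Ignores post-gain electronics noise."""
    signal_e = qe * photons
    noise_e = math.sqrt(signal_e + read_noise ** 2)
    return (signal_e * gain) / (noise_e * gain)

base     = snr_after_gain(1_000, gain=1)    # low light, base ISO
high_iso = snr_after_gain(1_000, gain=16)   # same light, ISO cranked up
more_light = snr_after_gain(16_000, gain=1) # 16x the photons at base ISO
```

In this model `base == high_iso` exactly, while `more_light` is far better: it's the extra photons from the wider aperture, not the gain setting itself, doing the work.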
I notice that magnification has also been mentioned, so let's go back and look at those pixels again. Remember that the accumulated charge per pixel is measured and converted to a digital value? The analog-to-digital converter doesn't include the physical size of the pixel, and certainly not the physical size of the sensor, in the data it records. It doesn't need to. All it needs is a digital value that records the measured charge for each individual pixel of whatever resolution in megapixels the sensor happens to have.
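For anyone who wants to see just how little the converter "knows", here's a toy 14-bit quantizer. The full-scale voltage is an arbitrary placeholder; the thing to notice is what's absent from the inputs: no pixel pitch, no sensor dimensions.

```python
def adc_14bit(voltage, full_scale=1.0):
    """Quantize one pixel's analog measurement to a 14-bit code (0..16383).
    Inputs are a voltage and a full-scale reference -- nothing about the
    physical size of the pixel or the sensor appears anywhere."""
    clipped = max(0.0, min(voltage, full_scale))
    return round(clipped / full_scale * (2 ** 14 - 1))
```

The output is just a number per pixel; everything downstream works in those numbers.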
And now, finally, you want to look at the photo on your monitor. Those digital values per pixel from the sensor are downsampled to your monitor resolution and displayed as a luminance level (and of course a colour) per pixel on the monitor. The physical size of the sensor that took the photo, and even the physical size of the monitor, are irrelevant. All that matters is the pixel resolution. If you had a 24 megapixel APS-C sensor and a 24 megapixel FF sensor with the same signal-to-noise ratio per pixel, the resulting images would be indistinguishable in terms of noise.
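A quick demo of what downsampling to a lower viewing resolution does to per-pixel noise. This uses a crude 2×2 box filter on a synthetic flat grey field with made-up noise numbers (real resamplers use fancier kernels, but the averaging effect is the same):

```python
import random
import statistics

random.seed(0)

def downsample_2x(img):
    """Average each 2x2 block into one output pixel (crude box filter)."""
    h, w = len(img), len(img[0])
    return [[(img[y][x] + img[y][x + 1] + img[y + 1][x] + img[y + 1][x + 1]) / 4
             for x in range(0, w, 2)]
            for y in range(0, h, 2)]

# Flat grey field (level 128) with Gaussian noise of standard deviation 8:
src = [[128 + random.gauss(0, 8) for _ in range(64)] for _ in range(64)]
out = downsample_2x(src)

src_flat = [v for row in src for v in row]
out_flat = [v for row in out for v in row]
# Averaging four pixels cuts the noise standard deviation roughly in half,
# while the mean level stays put.
```

This is why viewing resolution, not sensor or screen dimensions in millimetres, is what decides how noisy the displayed image looks.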
The only time you are ever magnifying or enlarging a digital photo is if you upsample it to a higher viewing resolution than the original sensor resolution.
The thing is, there's an actual way that cameras work, and it's well understood, and it isn't open to debate. It's just the way they work.