Originally posted by clackers: "Of course he would, you can't argue with physics, Fogel, a subject I used to teach for a living. The amount of shot noise depends on and only on the number of photons that have hit the individual pixel."
This is about as nonsensical as arguing that one should measure the speed of an automobile from the revolutions of a wheel regardless of the wheel's diameter: two cars can travel at the same wheel RPM, yet at different speeds, if their wheels are different sizes.
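To make the analogy concrete, here is a minimal sketch in Python (the RPM and diameter figures are made-up illustration values):

[code]
import math

def speed_kmh(wheel_rpm, wheel_diameter_m):
    # Distance per revolution is the circumference (pi * diameter);
    # rpm * circumference gives metres per minute, scaled to km/h.
    return wheel_rpm * math.pi * wheel_diameter_m * 60 / 1000

# Same RPM, different wheel sizes -> different road speeds.
print(speed_kmh(800, 0.60))  # ~90.5 km/h
print(speed_kmh(800, 0.75))  # ~113.1 km/h
[/code]

Per-pixel shot noise is like the RPM reading: a real number, but not a fair comparison between cameras until you normalize for the size of the thing being measured.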
When evaluating an SNR you always have to understand the bandwidth it represents. Measuring over a very narrow bandwidth will decrease your SNR; measuring across a larger bandwidth will increase it. But looking at the SNR with a narrower bandwidth does not mean you have captured less signal; it just means that a wide-bandwidth measurement did not have the acuity to see those finer variations. With an imaging sensor we are able to capture all the data that falls within the bandwidth.
With imaging sensors of greater resolution, one is able to view the image at narrower bandwidths, but this does not mean we lose SNR to extra noise. If we view each image at a fixed bandwidth or resolution, a given signal represents very different areas within the image: a sample taken at the narrower bandwidth covers only a small portion of the image and its noise has little impact, while a sample taken at the wider bandwidth represents much more of the image and its noise has a far greater impact.
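Here is a rough numerical sketch of that point, assuming pure Poisson shot noise and a uniform scene (the grid sizes and photon counts are arbitrary illustration values):

[code]
import numpy as np

rng = np.random.default_rng(0)
mean_photons_per_coarse_pixel = 400  # arbitrary illustration value

# Fine sensor: 4x the pixels over the same area, so 1/4 the photons each.
fine = rng.poisson(mean_photons_per_coarse_pixel / 4, size=(1024, 1024))
# Coarse sensor: same total light, same sensor area, bigger pixels.
coarse = rng.poisson(mean_photons_per_coarse_pixel, size=(512, 512))

def snr(img):
    return img.mean() / img.std()

# Per-pixel ("narrow bandwidth") SNR: the fine sensor looks worse...
print(snr(fine), snr(coarse))  # ~10 vs ~20

# ...but normalized to the same viewing bandwidth (2x2 binning),
# the fine sensor's photons add back up and the SNRs match.
binned = fine.reshape(512, 2, 512, 2).sum(axis=(1, 3))
print(snr(binned))  # ~20 again
[/code]

The per-pixel figure is real, but it describes a narrower bandwidth; compared at any common viewing bandwidth, the two sensors deliver the same SNR from the same light.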
---------- Post added 10-03-2022 at 08:10 PM ----------
Originally posted by clackers: "Technology can only really change the much lower thermal noise contribution, or, in the case of baked RAW files, by doing some sort of averaging out of adjacent pixels, which also destroys detail."
How is combining adjacent pixels and losing detail any different from not capturing the detail in the first place? And we are not really combining pixels when we normalize; we are placing the recorded data in the area that data represents (a smaller area).
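A sketch of why post-capture combining is statistically no worse than never capturing the detail, assuming shot-noise-limited pixels (real sensors add read noise per pixel, which this deliberately ignores):

[code]
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Four small pixels summed after capture...
summed = rng.poisson(100, size=(n, 4)).sum(axis=1)
# ...versus one big pixel that caught the same light directly.
big = rng.poisson(400, size=n)

# Same mean, same variance: a sum of Poisson counts is itself Poisson,
# so the combined result is indistinguishable from the native coarse capture.
print(summed.mean(), summed.var())  # ~400, ~400
print(big.mean(), big.var())        # ~400, ~400
[/code]

The difference is that the fine capture keeps the option of not combining: nothing is lost by also having recorded where, within the big pixel, the light landed.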
---------- Post added 10-03-2022 at 08:21 PM ----------
Originally posted by clackers: "The amount of shot noise depends on and only on the number of photons that have hit the individual pixel."
You do realize that one of the sensor designers I have been following has the goal of recording only whether a single photo site has been struck by a single photon? Any guesses as to how he predicts that sensor will perform?
A hint: he has been working on it for a few years, with much more success than what is presently available. One photon per photo site, or "jot" as he calls them.
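For what it's worth, here is a toy sketch of the single-photon-per-site idea; this is my own illustration of the general concept, not the designer's actual architecture. Each tiny site records only a binary hit/no-hit, and a conventional pixel value is rebuilt by summing sites over space and time:

[code]
import numpy as np

rng = np.random.default_rng(2)

arrival_rate = 0.05   # mean photons per site per field, kept well below 1
fields = 256          # temporal oversampling (readouts per exposure)

# Each binary site reports 1 if at least one photon arrived, else 0.
arrivals = rng.poisson(arrival_rate, size=(fields, 64, 64))
binary = (arrivals > 0).astype(np.uint16)

# Rebuild coarse pixels by summing 4x4 blocks of sites across all fields.
pixel = binary.sum(axis=0).reshape(16, 4, 16, 4).sum(axis=(1, 3))

print(pixel.mean())                # ~ fields * 16 * arrival_rate, less clipping
print(pixel.mean() / pixel.std())  # close to the shot-noise-limited SNR
[/code]

With the arrival rate kept far below one photon per site per readout, almost every photon gets counted individually, which is why such a design can approach the shot-noise limit.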