Originally posted by Jonathan Mac That's not true at all. Eyes are still subject to the same laws of physics as cameras and lenses. The eye automatically focuses on whatever you look at. There are out-of-focus areas, but as soon as you pay attention to one and the eye moves there, it becomes an in-focus area. But they are there.
Obviously, individuals vary; we aren't subject to the same manufacturing tolerances as camera lenses. But from what I have been able to glean, the human eye is roughly a 17mm f/2.2 lens with a minimum aperture of about f/8.3. Since the image distance to the retina is also about 17mm, the human eye is, more or less, a "standard lens" for its format.
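As a quick sanity check on those numbers: the f-number is just focal length divided by entrance-pupil diameter. A minimal sketch, assuming a 17mm focal length and typical human pupil diameters of roughly 2mm (constricted) to 8mm (fully dilated) — illustrative values, not measured eye data:

```python
# f-number = focal length / entrance-pupil diameter.
# Assumed values (illustrative only): 17 mm focal length,
# pupil diameters of ~2 mm constricted and ~8 mm fully dilated.
def f_number(focal_length_mm: float, pupil_diameter_mm: float) -> float:
    return focal_length_mm / pupil_diameter_mm

focal = 17.0
print(f"dilated (8 mm pupil):     f/{f_number(focal, 8.0):.1f}")  # ~f/2.1
print(f"constricted (2 mm pupil): f/{f_number(focal, 2.0):.1f}")  # f/8.5
```

That lands near the quoted f/2.2 and f/8.3 figures; the exact numbers depend on which pupil sizes you assume.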
As you have noted, the problem with seeing the out-of-focus areas is that the optical system works against us: it refocuses on whatever we pay attention to, and actively resists letting us attend to what we are not looking at. We also tend to scan a scene to build up an image, so we think we are seeing a wider field of view than we actually are.
In addition, a 17mm f/2.2 lens that is standard for the format is going to have fairly deep depth of field wide open, and probably close to infinite depth of field in bright sunlight at its minimum aperture of f/8.3.
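To put a rough number on that depth-of-field claim, here's a sketch using the standard hyperfocal-distance formula H = f²/(N·c) + f. The circle of confusion of 0.01mm is an arbitrary illustrative value for a small format, not a measured property of the eye:

```python
# Hyperfocal distance: H = f^2 / (N * c) + f
# Assumptions (illustrative, not measured eye data):
#   f = 17 mm focal length, circle of confusion c = 0.01 mm.
# Focusing at H renders everything from ~H/2 to infinity acceptably sharp.
def hyperfocal_mm(f_mm: float, n: float, coc_mm: float = 0.01) -> float:
    return f_mm**2 / (n * coc_mm) + f_mm

for n in (2.2, 8.3):
    h = hyperfocal_mm(17.0, n)
    print(f"f/{n}: hyperfocal ~{h / 1000:.1f} m "
          f"(sharp from ~{h / 2000:.1f} m to infinity)")
```

Under those assumptions the hyperfocal distance at f/8.3 comes out around 3.5m, i.e. everything from under 2m to infinity is acceptably sharp, which is why "close to infinite depth of field" is a fair description.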
It makes it difficult to compare the human eye to a camera lens.
---------- Post added Mar 23rd, 2022 at 08:33 AM ----------
Originally posted by BigMackCam Bingo... and that's due to what is still fairly rudimentary depth-mapping. It's clever and cute, but still quite limited... though the results are usually good enough for social media...
Do keep in mind that this is a technology that is in ongoing development, and will continue to get more and more convincing as time goes on. Writing it off now would be akin to writing off the space program in the late 1950s because rockets kept exploding on the launch pad.