Originally posted by bxf Assuming this phenomenon (i.e. the inconsistent relationship between focusing distance and sharpness) is reasonably common, this statement actually answers much of my question. i.e. I was trying to understand whether such an inconsistency is a simple reality of lens design limitations, or an optical/visual fact. I trust I'm making sense here. Perhaps it is clearer if I put it like this: I was wondering whether more distant subjects will necessarily always appear less sharp, even if the lens is a prime whose IQ is reported as being consistent, though I suppose this would be determined by the lens testing methodology.
If the above is still somewhat incoherent (and I concede that it may be), consider this: your own eyes will resolve closer objects better than more distant objects, even though it's the same pair of eyes. It occurred to me that similar limitations may exist with lenses.
Most lens tests are conducted at a distance of a few meters, basically mid-range, so as pcarfan noted, those tests don't tell the full story. Lens designers optimize their lenses for different distances, sometimes depending on the intended use of the lens. A "landscape" lens, for example, would be optimized for infinity, while a "macro" lens would be optimized for close-up. Tricks like "floating elements" can help a lens perform well over a whole range of distances.
When thinking about how healthy eyes see, near versus far, it's better to think in terms of resolving angles rather than some linear scale like millimeters. That way, resolving a millimeter at a distance of one meter is the equivalent of resolving one meter at a distance of a kilometer.
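To make that concrete, here is a minimal sketch in Python (subtended_angle_rad is just my own illustrative helper, not from any lens-testing tool) showing that both cases subtend the same angle of roughly a milliradian:

import math

def subtended_angle_rad(size_m, distance_m):
    # Full angle subtended by a feature of a given size at a given distance.
    return 2 * math.atan(size_m / (2 * distance_m))

# 1 mm viewed from 1 m and 1 m viewed from 1 km subtend the same angle:
print(subtended_angle_rad(0.001, 1.0))    # ~0.001 rad
print(subtended_angle_rad(1.0, 1000.0))   # ~0.001 rad

For angles this small, atan(x) is effectively x itself, so the formula collapses to size divided by distance, which is why 1 mm / 1 m and 1 m / 1 km come out identical.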