Originally posted by stevebrot: "I have found it disturbing how common the bad copy phenomenon seems to be in regards to reviews of Pentax lenses."
It's not just Pentax lenses. It's not uncommon for online test results to fail to match user experiences, regardless of brand. And, oddly enough, it goes in both directions, with some lenses scoring ridiculously high on tests and others suspiciously low. Just look at ephotozine's test of the Olympus 14-150. No way does that lens have the edge sharpness the tests show. Even the full-res sample images are not consistent with their own test!
I see the same issue plaguing ephotozine's "test" of the DA* 60-250. Look at the full-res sample photos: they are among the sharpest, and easily the most impressive, of any sample images I've seen across ephotozine's reviews, and I've gone through quite a number of them. The DA* 60-250 shots show superb sharpness and microcontrast, yet they are entirely inconsistent with ephotozine's own test results.
Is sample variation really the problem here, or is it tester error or incompetence? Why are tests from different sites so often at odds with one another, and why are they at odds with sample images and user experiences? If sample variation really is that extreme, then these online tests are pretty much useless, because you'll never know whether the reviewer got a good copy or a bad one.