I see that everybody seems to take it for granted that dedicated-sensor (phase-detection) AF, as used by all SLRs including DSLRs, is far better than "contrast AF", where the actual image sensor data is used.
Can anybody provide arguments for this?
I cannot see a reason. Film SLRs, lacking an image sensor, had no other choice, and early DSLRs probably just continued to use what was already there. And then everybody noticed that "contrast AF" would black out the viewfinder (mirror up), so not much research may have been invested in this direction. But none of this is an argument for one method over the other.
I once implemented a "contrast AF" for microscopes, meant to replace the infrared laser AF that is commonplace in some of those systems.
Well, I ended up applying a smarter method than just maximizing an image's contrast, which I cannot disclose here. All I can say is that it performed extremely fast, accurately and reliably, with only a few re-focus steps involved (about 3-4). It easily met the specs of the hardware infrared laser AF.
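Just to make clear what the baseline looks like, here is a minimal generic sketch (in Python) of the plain hill-climbing contrast AF that my method improved upon. This is not the method I used; `capture_frame` and `move_focus` are hypothetical hardware hooks, not any real camera or microscope API, and the sharpness metric is a crude stand-in.

```python
import numpy as np

def sharpness(image: np.ndarray) -> float:
    """Plain contrast metric: sum of squared horizontal/vertical gradients.
    (A crude stand-in; real systems use more robust metrics.)"""
    img = image.astype(float)
    gx = np.diff(img, axis=1)
    gy = np.diff(img, axis=0)
    return float((gx ** 2).sum() + (gy ** 2).sum())

def contrast_af(capture_frame, move_focus, step: float, max_steps: int = 50) -> None:
    """Naive hill climb: step the focus motor until the contrast metric drops,
    then back up, reverse, and halve the step for a finer pass.

    capture_frame() and move_focus(delta) are hypothetical hardware hooks.
    """
    best = sharpness(capture_frame())
    direction = +1
    for _ in range(max_steps):
        move_focus(direction * step)
        current = sharpness(capture_frame())
        if current > best:
            best = current                  # still climbing: keep going
        else:
            move_focus(-direction * step)   # overshot: step back
            if step < 1e-3:                 # coarse-to-fine termination
                break
            direction *= -1                 # reverse and refine
            step /= 2
```

The point is that this naive loop has to hunt: it only learns the right direction by overshooting, which is exactly what a smarter scheme avoids.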
I understand that photography may be different, because lenses may be slower to re-focus and each snapshot may need a significant exposure time. On the other hand, microscopes are harder in other respects, because they have a DoF of only a few micrometers (which makes manual focus very hard while all you see is gray soup...).
So, ideally supported by traditional focus sensors to predict the direction of re-focus, "contrast AF" should be faster, more reliable and far more accurate than the traditional method alone.
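To make that hybrid idea concrete, here is a hedged sketch reusing the contrast_af routine from the sketch above. `phase_defocus_estimate` is a hypothetical hook standing in for the phase-detect sensor's output; I'm assuming it returns a signed focus correction in the same units move_focus() accepts.

```python
def hybrid_af(capture_frame, move_focus, phase_defocus_estimate) -> None:
    """Hypothetical hybrid scheme: the phase-detect sensor supplies a signed
    defocus estimate used as a single coarse jump, then contrast AF refines."""
    coarse = phase_defocus_estimate()     # sign tells us which way to go
    move_focus(coarse)                    # one large jump, no hunting
    # fine pass: small contrast hill climb around the predicted position
    contrast_af(capture_frame, move_focus, step=max(abs(coarse) * 0.1, 0.01))
```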
Does anybody know why this isn't done?