Originally posted by falconeye [...]However, and this is how I did it in my algorithm, two measurements tell you the direction and a third measurement gives you an approximate magnitude.
With the kind of sensor I am assuming (full pixel 60fps device), you would constantly measure contrast, at the second frame (i.e., after 30ms) you would know the direction and while driving into focus, you would "see" it coming and make a "soft landing" at the exact optimum focus point. [...]
So, contrast detect needs more measurements but each individual measurement can be made much faster because it can use much more light. Given a fast enough sensor and processor, contrast detect eventually wins.
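The strategy falconeye describes (two readings give the direction, further readings let you "see" the peak coming and soft-land on it) is essentially a hill climb on the contrast curve. A minimal sketch, with a simulated contrast function of my own invention standing in for real sensor readings:

```python
# Hypothetical sketch of contrast-detect AF as a hill climb.
# contrast() is a made-up stand-in for per-frame contrast measurements;
# positions are in arbitrary lens-drive units.

def contrast(pos, peak=40.0, width=25.0):
    """Simulated contrast reading: smooth maximum at the true focus point."""
    return 1.0 / (1.0 + ((pos - peak) / width) ** 2)

def focus(start=0.0, step=5.0, tol=0.25):
    """Two measurements give the direction; then drive toward the peak,
    reversing and halving the step when contrast drops (the 'soft landing')."""
    pos = start
    prev = contrast(pos)
    frames = 1
    direction = 1.0
    nxt = contrast(pos + step)   # second frame: now we know which way to go
    frames += 1
    if nxt < prev:
        direction = -1.0
    else:
        pos += step
        prev = nxt
    while step > tol:
        cand = contrast(pos + direction * step)
        frames += 1
        if cand > prev:
            pos += direction * step
            prev = cand
        else:
            # Overshot the peak: reverse and shrink the step.
            direction = -direction
            step /= 2.0
    return pos, frames
```

With a 60 fps read-out each of these `frames` costs one frame interval, which is why the argument hinges on sensor speed rather than on the number of measurements.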
Three measurements, and we're still only talking about an "approximate magnitude". And it assumes sensors yet to be developed (full-image 60fps read-out, even in low light), so we're talking about a fairly distant future. OK then, I'm more concerned with the near future, and I don't see contrast-detect AF (whose best current implementations are merely comparable to entry-level DSLRs) beating phase-detect AF so badly in only a few years.
And even then, we'd still have to talk about predictive AF.
Sorry, I'm just not convinced this would be the better technology compared to a well-calibrated, high-end phase-detect AF.
About a moving sensor: I already have my doubts about SR (does the sensor plate remain perfectly calibrated over time?)
Focusing by moving the sensor would mean moving it backwards, outside the mirror box, adding a lot of bulk. Even in a 645-style-and-size "compact EVIL" camera, it would lose close-up/macro capability for all but the shortest lenses, and on long teles you'd have two options: infinity and near infinity.
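Some back-of-envelope thin-lens numbers (my own sketch, not from the thread) show why: the extra sensor travel needed beyond the infinity position is roughly f²/(s−f) for focal length f and subject distance s, which explodes for long lenses and close subjects.

```python
# Back-of-envelope thin-lens calculation (my own assumption for illustration):
# extra sensor travel beyond infinity focus = f^2 / (s - f).

def sensor_travel_mm(focal_mm, subject_m):
    """Sensor travel (mm) needed to focus a thin lens of focal length
    focal_mm on a subject at subject_m metres."""
    s = subject_m * 1000.0  # subject distance in mm
    f = focal_mm
    return f * f / (s - f)

# A 50 mm lens focused at 1 m needs about 2.6 mm of travel; a 300 mm lens
# at 3 m needs about 33 mm, far beyond what any SR-style mechanism moves.
```

Which is exactly the "infinity and near infinity" problem: within a few millimetres of travel, a long tele can only cover distant subjects.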
There are a lot of good reasons why this shouldn't be done.