Quote: I think it means they added a new light source to the data range of phase-detection AF.
No.
Quote: Actually, since they said 'data range' rather than 'detection range,' it's quite possible that they *didn't* add a new sensor, just the ability to receive and process more useful information from *a* sensor. Could even be the very same one, that just previously hadn't been fully utilized.
No. The sensor is clearly shown and discussed as being a color temperature sensor.
Quote: Could have. If having more useful data makes a positive lock easier under whatever set of conditions, then there's less dithering to do. It's kind of like radar: a bigger antenna, so to speak.
Bad analogy, but there is some truth in this.
Quote: An algorithm is a mathematical process that is the means by which something like AF data is turned from a bunch of light falling on a sensor, into commands for something like the AF or SR.
If there's a more-efficient way to get the same results out of fewer computer cycles, (and fewer times blipping the lens to get more data) yes, it happens faster. This is why computer speed is measured in Hertz.
No offense intended, but I am quite aware of what an algorithm is.
Quote: Agreed. But this is also all about the math the computer does and what information it has to process. There isn't like a separate 'predictive AF machine' someone sticks in a camera, ...just more math.
Yes and no, but mainly no. Other brands use many AF sensor points, many of them hidden, to create a predictive AF system that includes hardware (more sensors) and AI-like algorithms to manage the AF tracking. It is more like a closed-loop servo system.
The hidden sensors pick up the object and start tracking it, and because there are enough of them, the system has enough data to start predicting where the tracked object is headed. They hand this work off to other sensors as the object moves (so to speak) and keep passing the information to the AF system, which drives the lens to where the object is PREDICTED to be, not where it was or is. At some point in the process the main AF sensors lock and you can take the image, but in the meantime the lens has never stopped driving toward the focus point that the hardware and the software predict the object will reach.
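To make the idea concrete, here is a minimal sketch of that kind of predictive loop (purely illustrative, not any camera's actual firmware; the class and parameter names are my own invention). It assumes the simplest possible model: estimate the subject's velocity from the last two distance readings and drive the lens toward where the subject will be after the lens-drive latency, rather than where it was measured.

```python
# Hypothetical predictive-AF sketch: constant-velocity prediction from
# recent (time, distance) samples, compensating for lens-drive latency.

def predict_position(samples, lead_time):
    """Extrapolate subject distance lead_time seconds ahead,
    assuming constant velocity between the last two samples."""
    (t0, d0), (t1, d1) = samples[-2], samples[-1]
    velocity = (d1 - d0) / (t1 - t0)
    return d1 + velocity * lead_time

class PredictiveAF:
    def __init__(self, lead_time=0.05):
        self.samples = []           # history of (timestamp, distance) readings
        self.lead_time = lead_time  # assumed lens-drive latency to compensate

    def update(self, timestamp, distance):
        """Feed one new sensor reading; return the lens target distance."""
        self.samples.append((timestamp, distance))
        if len(self.samples) < 2:
            return distance         # not enough data to predict yet
        return predict_position(self.samples, self.lead_time)

# Subject approaching at 10 m/s, sampled every 20 ms:
af = PredictiveAF(lead_time=0.05)
targets = [af.update(t * 0.02, 20.0 - 10.0 * t * 0.02) for t in range(4)]
```

After the second sample the loop always aims half a metre (10 m/s x 0.05 s) ahead of the measured position, which is the whole point: the lens is already moving toward where the subject will be when it gets there. A real system would use many more samples, a better motion model, and the sensor-handoff logic described above.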
Quote: Speed improvements come from that being fast and accurate, and from the machinery being able to respond as quickly as possible to it. The machinery can't go any faster than the calculations, though. So improving the calculations ....either with more computer power, a broader range of data, or more efficient ways to process it, certainly pays dividends.
Of course faster processing will always improve such a system, but all of this has little or nothing to do with low light focusing. If the processing were the problem in low light, the system would not be any faster at higher light levels than it is at low levels. The AF hardware has to be the main hold-up in low light.
You could, however, keep the exact same arrangement as the K20D but employ AF sensors two years newer that are much more sensitive to light, and get low-light improvements without making any other huge change to the overall system.
Ray