What is the practical difference between phase-detect and contrast-based autofocus?
To put it into manual-focus terms, contrast-detect autofocus is like trying to focus an image on a plain ground-glass screen, while phase-detect is like using a split-prism focus aid or a rangefinder. In one scheme, you're looking for a local maximum on a gentle curve; in the other, you're just looking for things to line up. It's a lot easier to decide when things are lined up than when things are maximally contrasty.
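To make the "lining up" idea concrete, here's a toy sketch (all names and the sign convention are invented for illustration): a phase-detect sensor sees two copies of the same 1-D intensity profile, formed by opposite sides of the lens pupil, and simply searches for the shift that makes them match. The sign of that shift tells the camera which way the lens is off.

```python
# Toy phase-detect measurement: find the shift that best aligns two
# 1-D intensity profiles by minimizing the mean squared difference.
# Real AF modules do something far more refined, but the principle
# -- one comparison yields both direction and rough magnitude -- holds.

def best_shift(a, b, max_shift=5):
    """Return the shift of b relative to a that minimizes the mean
    squared difference over the overlapping samples."""
    best, best_err = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        pairs = [(a[i], b[i + s]) for i in range(len(a))
                 if 0 <= i + s < len(b)]
        err = sum((x - y) ** 2 for x, y in pairs) / len(pairs)
        if err < best_err:
            best, best_err = s, err
    return best

# The same edge seen through the two halves of the pupil, displaced
# by 3 samples because the image is out of focus:
profile = [0, 0, 1, 5, 9, 10, 10, 10, 9, 5, 1, 0, 0, 0, 0]
shifted = profile[3:] + [0, 0, 0]

print(best_shift(profile, shifted))  # -3: sign gives the focus direction
```

When the image is in focus the two profiles coincide and the measured shift is zero; any nonzero result says at once which way to drive the lens.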
Now, electronics can find the point of maximum contrast faster than we can, since they can be sensitive enough to reverse the moment the contrast curve begins to fall, but that's still not quite as easy as comparing two images to see if they line up. And since a phase-detect system knows which image is which, it always knows in which direction it needs to focus to make the correction. It's always a guess with contrast detection -- you focus in one direction, and if it gets worse instead of better, you reverse direction.
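That guess-and-reverse loop is just hill climbing on the contrast curve. Here's a minimal sketch, with an invented lens model and contrast function standing in for "move the lens and re-read the sensor":

```python
# Toy contrast-detect AF: step the lens, re-measure contrast, and
# reverse (with finer steps) whenever contrast drops. The curve and
# the stopping rule are made up purely for illustration.

def contrast_at(position, peak=42):
    """Stand-in for 'read the sensor and measure contrast': a gentle
    curve whose maximum sits at the in-focus position."""
    return 100 - 0.1 * (position - peak) ** 2

def contrast_af(start, step=4):
    pos, direction = start, +1          # initial direction is a pure guess
    best = contrast_at(pos)
    reversals = 0
    while reversals < 2 and step >= 1:
        trial = pos + direction * step
        c = contrast_at(trial)
        if c > best:                    # sharper: keep going this way
            pos, best = trial, c
        else:                           # worse: overshot the peak, turn around
            direction = -direction
            step //= 2                  # hunt back with finer steps
            reversals += 1
    return pos

print(contrast_af(start=10))  # 10 is below the peak: the +1 guess was right
print(contrast_af(start=80))  # 80 is above it: the first move is wasted
```

Note that starting from either side converges on the same focus position, but when the initial guess is wrong the system pays for one wasted move before it can turn around, which is exactly the overhead phase detection avoids.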
That said, some of the cameras they're making these days have incredible electronics, so there might not be an appreciable difference to the average photographer. Either way, driving the lens (and the backlight for the monitor, if you're using one) is going to be the main source of power drain. Yes, reading the entire imaging sensor "costs" more than reading a few specialized autofocus sensors (or just one), but is it a difference you're going to notice? Probably not. The real drain with contrast-detect AF is usually that you can't use an optical viewfinder, so the monitor (or EVF) is active the whole time -- not that the focus system itself is running.
I like your analogy, but I wonder whether reading a video stream from the sensor (as in live view) isn't a much bigger drain on the battery than keeping the sensor off entirely and reading only the dedicated AF points (and reading those far less often)?