1. Responsive Capture
With iOS 17, Apple introduced a few new technologies that allow photos to be taken much faster – improving shot-to-shot time while maintaining the highest possible quality. We’re happy to announce that these have all been implemented in our Pro Camera app – download or update here!
Why does this matter?
These features are best suited for action shots and spur-of-the-moment captures. They improve results in two subtly different ways:
- Zero shutter lag: At the time you press the shutter button, what you see in the viewfinder is now what you’ll see in the captured image. The result is now limited only by your reflexes rather than being subject to processing latency as well.
- Capture pipeline improvements: After an image is taken, the camera system is much faster to “recover” and be ready to take another. This reduced delay means you can take more images of a moving subject faster, hopefully acquiring a frame that contains the action or composition you want to capture.
We’ve made these two features available automatically, but there’s also a third, optional feature: fast capture prioritization. If you want to ensure you don’t miss a shot, you can opt in to this behavior (called “Prioritize Faster Shooting” in our app settings). It temporarily increases shutter responsiveness at the expense of slightly reduced image quality.
How does it work?
Responsive capture pipeline
Before iOS 17, an image had to be fully processed before the next image could be taken. Now, two new features work together to mitigate this latency. Responsive capture allows multiple photo captures to operate simultaneously, rather than waiting for one to complete entirely before taking the next photo. Further, with deferred photo processing, some of this processing can occur in the background, freeing up resources for use in grabbing new images.
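For developers curious about the plumbing: on iOS 17, both behaviors are opt-in properties of AVFoundation’s AVCapturePhotoOutput. Here’s a minimal sketch of how an app like ours might enable them (session and device setup omitted; note that responsive capture builds on zero shutter lag, which must be enabled first):

```swift
import AVFoundation

// Simplified sketch: opt an AVCapturePhotoOutput into the iOS 17
// responsive capture pipeline. Assumes `photoOutput` is already
// attached to a configured, running AVCaptureSession.
func configureResponsivePipeline(_ photoOutput: AVCapturePhotoOutput) {
    // Responsive capture requires zero shutter lag to be on.
    if photoOutput.isZeroShutterLagSupported {
        photoOutput.isZeroShutterLagEnabled = true
    }

    // Allow a new capture to begin while the previous photo is
    // still being processed.
    if photoOutput.isResponsiveCaptureSupported {
        photoOutput.isResponsiveCaptureEnabled = true
    }

    // Deferred delivery: hand back a lightweight proxy immediately
    // and finish heavy processing (e.g., Deep Fusion) in the background.
    if photoOutput.isAutoDeferredPhotoDeliverySupported {
        photoOutput.isAutoDeferredPhotoDeliveryEnabled = true
    }
}
```

With deferred delivery enabled, the finished image arrives later through the `photoOutput(_:didFinishCapturingDeferredPhotoProxy:error:)` delegate callback; the proxy can be saved to the photo library, which finalizes the full-quality image in the background.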
Zero shutter lag
As of iOS 17, your phone maintains a circular buffer of incoming sensor data while the camera is active. When you press the shutter button, the camera system can grab frames from the past (captured just before you pressed the button) to merge into a single high-quality image, rather than only capturing frames after the press. This means you capture the subject as it was when you decided to take the photo, not the state it was in after the shutter press registered. For any audio geeks out there, this feature is conceptually similar to retrospective recording.
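To make the ring-buffer idea concrete, here’s a toy sketch (our own illustration, not Apple’s implementation): a fixed-capacity buffer that always holds the most recent frames, so a “shutter press” can reach back in time.

```swift
// Conceptual sketch only: a ring buffer of the N most recent frames.
// When the shutter fires, frames captured just *before* the press are
// still available to be merged into the final image.
struct FrameRingBuffer<Frame> {
    private var frames: [Frame] = []
    private let capacity: Int

    init(capacity: Int) { self.capacity = capacity }

    mutating func append(_ frame: Frame) {
        frames.append(frame)
        // Drop the oldest frame once we exceed capacity.
        if frames.count > capacity { frames.removeFirst() }
    }

    // What the "shutter press" can see: the recent past.
    var recentFrames: [Frame] { frames }
}

var buffer = FrameRingBuffer<Int>(capacity: 3)
for sensorFrame in 1...5 { buffer.append(sensorFrame) }
// At shutter time the buffer holds the three most recent frames: [3, 4, 5]
```

In the real capture pipeline this buffering happens continuously at sensor level while the camera is active, which is why the result reflects the moment you decided to shoot rather than the moment the system finished reacting.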
Fast capture prioritization
When the camera detects faster-than-usual shutter presses, it can temporarily reduce photo quality (e.g., by turning off Deep Fusion and other processing) so that each image takes less time to process and save. The system is ready to take another photo that much sooner.
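This is the behavior behind our “Prioritize Faster Shooting” setting. In AVFoundation it’s a single additional opt-in, which itself requires responsive capture to be enabled (a sketch, with setup omitted):

```swift
import AVFoundation

// Sketch: toggle fast capture prioritization, the trade of a little
// processing quality for shorter shot-to-shot time during rapid
// shutter presses. Requires responsive capture to already be enabled.
func setPrioritizeFasterShooting(_ enabled: Bool,
                                 on photoOutput: AVCapturePhotoOutput) {
    guard photoOutput.isFastCapturePrioritizationSupported else { return }
    photoOutput.isFastCapturePrioritizationEnabled = enabled
}
```

Apps can also observe the output’s `captureReadiness` property to keep the shutter button’s UI in sync with how soon the system can take the next shot.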
For more technical details, see the WWDC session here.
Overall, the difference is striking; you can see the results below. We compared two phones: a brand-new iPhone 14 Pro running iOS 16 and an older iPhone 11 Pro running our iOS 17 update. Both had image quality set to the highest possible setting, and both had “Prioritize Faster Shooting” turned off, so they took maximum-quality images the whole time. Despite its weaker processor, the iPhone 11 Pro could take many more photos in a shorter period. Meanwhile, the iOS 16 device was left playing catch-up: it kept slowly working through its queue of captures, even after we stopped pressing the shutter button.