Photo by Amelia Holowaty Krales / The Verge
Deep Fusion, Apple’s anticipated computational photography system, is live in the latest iOS 13 public beta. If you’re enrolled, iOS 13.2 should be available to download now, after it arrived for developers earlier today. The feature works on both the iPhone 11 and the iPhone 11 Pro.
Deep Fusion is designed to use artificial intelligence and other software tricks to sharpen images: it captures frames at different exposures, then merges them automatically.
The goal is to produce the highest-quality image possible. It’s supposed to work only in medium- to low-light scenes, whereas Smart HDR and Night mode handle extremely bright and extremely dark scenes, respectively. Apple marketing chief Phil Schiller calls it “computational photography mad science.”
In an interesting twist, you won’t find the feature if you go looking for it. According to Apple, Deep Fusion is purposefully hidden from users, and there’s apparently no visual indicator that it’s turned on, either in the camera app or in the EXIF data of the resulting photos.
So how will you know whether Deep Fusion was used on your photos? Apple clearly doesn’t want you to worry about it too much, as the results should speak for themselves. We’ll have more to share about our own experiences with Deep Fusion in the coming days.