The iPhone XR's Camera app will offer the same sort of bokeh (depth-of-field) post-processing effects as the iPhone XS, but will calculate depth "almost entirely" via algorithm when shooting with the rear camera, according to a newly published interview.
The XS and XS Max take advantage of the TrueDepth sensor on the front and the dual-lens camera on the back to gauge depth, Graham Townsend, Apple's senior director of Camera Hardware, told journalist Lance Ulanoff. While the XR does have TrueDepth on the front, it has only a single lens on the rear.
In the past, Apple's single-lens iPhones have been unable to tap iOS's Portrait mode, which simulates the bokeh of a DSLR lens by detecting a subject and artificially blurring the background. That system is enhanced on Apple's new iPhones, which not only let people scale the amount of bokeh after the fact but also mimic the way blur behaves on a real camera lens.
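To make the general technique concrete, here is a minimal sketch of depth-driven background blur: each pixel is blurred with a kernel whose radius grows with its distance (in depth) from the focal plane. This is an illustration only, not Apple's implementation; the `Image` type, `portraitBlur` function, and `bokehScale` parameter are all hypothetical stand-ins.

```swift
import Foundation

// Hypothetical grayscale bitmap, for illustration only.
struct Image {
    var width: Int
    var height: Int
    var pixels: [Float]          // row-major values in 0...1

    func pixel(_ x: Int, _ y: Int) -> Float {
        // Clamp coordinates so the filter can sample near image edges.
        let cx = min(max(x, 0), width - 1)
        let cy = min(max(y, 0), height - 1)
        return pixels[cy * width + cx]
    }
}

/// Blur each pixel with a box filter whose radius scales with the pixel's
/// depth offset from the subject. `bokehScale` stands in for the
/// user-adjustable bokeh amount described in the article.
func portraitBlur(image: Image, depth: Image,
                  focusDepth: Float, bokehScale: Float) -> Image {
    var out = image
    for y in 0..<image.height {
        for x in 0..<image.width {
            // Pixels at the subject's depth stay sharp (radius ~0);
            // pixels far from it get progressively larger blur kernels.
            let d = abs(depth.pixel(x, y) - focusDepth)
            let radius = Int((d * bokehScale).rounded())
            guard radius > 0 else { continue }

            var sum: Float = 0
            var count = 0
            for dy in -radius...radius {
                for dx in -radius...radius {
                    sum += image.pixel(x + dx, y + dy)
                    count += 1
                }
            }
            out.pixels[y * image.width + x] = sum / Float(count)
        }
    }
    return out
}
```

On the XR, the depth map fed into a step like this would come largely from algorithmic estimation rather than a second lens, which is the distinction Townsend draws.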
"We turned the model of a lens into math and apply that to that image. No-one else is doing what this does," said Apple VP of Software Camera, Photos, and Security Sebastian Marineau-Mes. "Others just do 'blur background."
The inclusion of Portrait shooting on the XR is one of the factors that should keep the phone popular despite the technical superiority of the XS line. Foremost is cost, since the XR starts at $749 versus $999 for the XS and $1,099 for the XS Max. Though the XR uses LCD instead of OLED, and lacks 3D Touch, it still has an edge-to-edge screen, Face ID, and an A12 processor.
Some analysts have forecast that the XR will greatly outsell its siblings, a view supported by supply-chain rumors claiming the device will account for 50 percent or more of Apple's production orders.
Comments
Once again I'm impressed, Apple.
I mean, isn't it always an algorithm? I've seen people say that it somehow uses both cameras, but how do you stitch images of completely different focal lengths together? It's never made sense to me.