
iPhone XR uses algorithms to simulate depth effects found on iPhone XS & XS Max

The iPhone XR's Camera app will offer the same sort of bokeh (depth-of-field) post-processing effects as the iPhone XS, but will calculate depth "almost entirely" via algorithm when shooting with the rear camera, according to a newly published interview.

The XS and XS Max take advantage of the TrueDepth sensor on the front and the dual-lens camera on the back to gauge depth, Graham Townsend, Apple's senior director of Camera Hardware, told journalist Lance Ulanoff. While the XR does have TrueDepth on the front, it has only a single lens on the rear.
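Townsend's comments describe Apple's internal pipeline, which isn't public, but third-party apps see the output of that depth work through AVFoundation regardless of which hardware produced it. As a rough sketch (not Apple's own code), requesting the depth map and portrait-effects matte at capture time might look like this:

```swift
import AVFoundation

// Minimal sketch (not Apple's internal pipeline): request depth and
// portrait-matte delivery from the capture output, so the same Portrait
// post-processing can run whether depth came from dual lenses, TrueDepth,
// or a single-lens segmentation algorithm.
final class PortraitCaptureDelegate: NSObject, AVCapturePhotoCaptureDelegate {
    let photoOutput = AVCapturePhotoOutput()

    func capturePortrait() {
        // Only enable the features the current hardware actually supports.
        photoOutput.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliverySupported
        photoOutput.isPortraitEffectsMatteDeliveryEnabled = photoOutput.isPortraitEffectsMatteDeliverySupported

        let settings = AVCapturePhotoSettings()
        settings.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliveryEnabled
        settings.isPortraitEffectsMatteDeliveryEnabled = photoOutput.isPortraitEffectsMatteDeliveryEnabled
        photoOutput.capturePhoto(with: settings, delegate: self)
    }

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        // The per-pixel disparity map and the person-segmentation matte,
        // when available, are what drive the synthetic depth-of-field effect.
        let depth = photo.depthData            // AVDepthData?
        let matte = photo.portraitEffectsMatte // AVPortraitEffectsMatte?
        print("depth: \(depth != nil), matte: \(matte != nil)")
    }
}
```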

In the past, Apple's single-lens iPhones have been unable to tap iOS's Portrait mode, which simulates the bokeh of a DSLR lens by detecting a subject and artificially blurring the background. That system has been enhanced on Apple's new iPhones not only to let people scale the amount of bokeh, but also to mimic the way it works on a real camera lens.
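Apple hasn't published how Portrait mode's renderer works, but the basic recipe described above — separate the subject from the background, blur the background, and let a slider scale the effect — can be sketched with stock Core Image filters. The function name and the 25-point maximum blur radius below are illustrative assumptions, not Apple's values:

```swift
import CoreImage
import CoreImage.CIFilterBuiltins

// Rough sketch of the generic Portrait recipe: blur a copy of the photo,
// then use a depth/person matte to keep the subject sharp. `aperture`
// plays the role of the depth slider (0 = no blur, 1 = maximum blur).
// This is not Apple's implementation, just the generic technique.
func portraitComposite(photo: CIImage, matte: CIImage, aperture: CGFloat) -> CIImage {
    let blur = CIFilter.gaussianBlur()
    blur.inputImage = photo
    blur.radius = Float(aperture * 25)          // illustrative maximum radius

    let blend = CIFilter.blendWithMask()
    blend.inputImage = photo                    // sharp subject (white in matte)
    blend.backgroundImage = blur.outputImage?.cropped(to: photo.extent)
    blend.maskImage = matte                     // white = subject, black = background
    return blend.outputImage ?? photo
}
```

Apple's pitch is that its renderer goes further than this kind of uniform background blur, shaping the blur the way a physical lens would.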

"We turned the model of a lens into math and apply that to that image. No-one else is doing what this does," said Apple VP of Software Camera, Photos, and Security Sebastian Marineau-Mes. "Others just do 'blur background."

The inclusion of Portrait shooting on the XR is one of the factors that should keep the phone popular despite the technical superiority of the XS line. Foremost is cost: the XR starts at $749, versus $999 for the XS and $1,099 for the XS Max. Though the XR uses an LCD instead of an OLED display and lacks 3D Touch, it still has an edge-to-edge screen, Face ID, and the A12 processor.

Some analysts have forecast that the XR will greatly outsell its siblings. That is supported by rumors claiming the device will account for 50 percent or more of Apple's production orders.



4 Comments

panoptician 11 Years · 65 comments

I mean, isn't it always an algorithm? I've seen people say that it somehow uses both cameras, but how do you stitch images of completely different focal lengths together? It's never made sense to me.

coolfactor 20 Years · 2342 comments

panoptician said:
I mean, isn't it always an algorithm? I've seen people say that it somehow uses both cameras, but how do you stitch images of completely different focal lengths together? It's never made sense to me.

Image recognition is very good these days. Take two images and the phone can match them on top of each other perfectly. But in this case one is focused on the background, one on the foreground.

paxman 17 Years · 4729 comments

panoptician said:
I mean, isn't it always an algorithm? I've seen people say that it somehow uses both cameras, but how do you stitch images of completely different focal lengths together? It's never made sense to me.

coolfactor said:
Image recognition is very good these days. Take two images and the phone can match them on top of each other perfectly. But in this case one is focused on the background, one on the foreground.

Exactly. That is basically the way I have been doing it in Photoshop for a couple of decades :smiley: 

"But in this case one is focusing on the background, one on the foreground."  - And it can then apply some kind of luring filter on either image with a slider. I am not sure this is exactly how it will work, of course. If it is there will inevitable be some visible noise between the foreground and the background. For most uses I am sure it will be amazing.