
Portrait mode's 'bokeh' was a risky and massive quest for perfection

iPhone 12 Pro camera bump


The Portrait mode in Apple's iPhone lineup was difficult to pull off, and the move into computational photography prompted managerial debate and carried a degree of professional risk for those involved.

The majority of iPhone users will be familiar with the Portrait mode on their devices, which keeps a subject in sharp focus while blurring the foreground and background elements around it. According to the Harvard Business Review profile of Apple's management of innovation, the journey to achieve that "bokeh" was a tricky one.

It was the kind of innovative technological move that Apple is renowned for. That renown, though, comes as much from how Apple manages its people as it does from the technology experts it employs.

In 2009, senior leader Paul Hubel had the idea of enabling iPhone users to take portrait-style photographs with bokeh. A Japanese term for the purposefully out-of-focus elements in a photograph, bokeh is typically used to create a soft, pleasing background that draws the viewer's attention to the in-focus subject.

Shots that use bokeh were typically limited to high-priced cameras, such as single-lens reflex cameras whose lenses can control depth of field with fine precision, a level of optical control smartphone camera systems could not match.

Hubel reckoned that a dual-lens camera design could take advantage of computational photography to create a composite image, one that mimicked the bokeh of SLR cameras. The concept was welcomed by Apple's camera team, whose stated purpose was to encourage "More people taking better images more of the time."
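The broad approach, as described, is to estimate depth from the offset between two lenses and then blur each pixel in proportion to its distance from the subject. The sketch below is only a rough illustration of that idea, using OpenCV's block-matching stereo and banded Gaussian blurs; it is not Apple's pipeline, and the file names, window sizes, and thresholds are placeholder assumptions.

```python
# Illustrative sketch only: estimate disparity from two horizontally offset
# views, then blur each pixel in proportion to how far it is from the subject.
# This is a rough stand-in for the kind of composite described above, not the
# actual iPhone pipeline. File names and tuning values are placeholders.
import cv2
import numpy as np

left = cv2.imread("left_lens.png", cv2.IMREAD_GRAYSCALE)    # one lens, rectified
right = cv2.imread("right_lens.png", cv2.IMREAD_GRAYSCALE)  # the other lens, rectified
color = cv2.imread("left_lens.png")                         # frame to render

# Block-matching stereo: nearer objects shift more between the two views,
# so disparity acts as a coarse inverse-depth map.
stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = stereo.compute(left, right).astype(np.float32) / 16.0
disparity = cv2.medianBlur(disparity, 5)

# Treat the median disparity in the central region as the subject's depth.
h, w = disparity.shape
subject_disp = np.median(disparity[h // 3: 2 * h // 3, w // 3: 2 * w // 3])

# Blur strength grows with distance (in disparity units) from the subject.
distance = np.abs(disparity - subject_disp)
output = color.copy()
for radius, lo, hi in [(5, 8, 16), (11, 16, 32), (21, 32, np.inf)]:
    band = (distance >= lo) & (distance < hi)
    blurred = cv2.GaussianBlur(color, (radius, radius), 0)
    output[band] = blurred[band]

cv2.imwrite("fake_bokeh.png", output)
```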

Though initial efforts were promising, "failure cases" frustrated the team, such as when the algorithm could not determine where the main subject ended and the blurry background began. When foreground elements were close to the subject but were intended to be blurred, the algorithm could mistakenly blur parts of them while keeping other parts in sharp focus.
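To make that kind of boundary failure concrete, consider a naive rule that keeps pixels sharp whenever their depth is close to the subject's. Anything that happens to sit at a similar distance, such as an object just behind a shoulder, then lands on the sharp side even though a photographer would want it blurred. The toy example below uses invented numbers and is purely illustrative; it is not how Apple's algorithm segments the scene.

```python
# Toy illustration of a boundary "failure case": a naive depth threshold
# keeps a nearby, unrelated object sharp because its depth is close to the
# subject's. All values here are invented for illustration.
import numpy as np

depth = np.full((6, 10), 5.0)   # background roughly 5 m away
depth[1:5, 2:5] = 1.2           # subject at about 1.2 m
depth[2:4, 6:8] = 1.5           # stray object just behind the subject

subject_depth = 1.2
keep_sharp = np.abs(depth - subject_depth) < 0.5   # naive "part of subject" rule

# The stray object falls inside the threshold, so it stays in focus,
# which is the kind of corner case that prompted the "tough discussions."
print(keep_sharp[2:4, 6:8].all())   # prints True
```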

One person involved in development said that Hubel was "out over his skis" during much of the production of the feature. Specifically, this meant that if the feature didn't work, or didn't work well, Hubel's team was in danger of losing credibility and weight with the rest of the iPhone development effort — and Apple's management itself.

Apple holds its engineering to a strict standard of zero artifacts, defined as "no unintended alteration in data introduced in a digital process by an involved technique and/or technology," so these so-called "corner cases" prompted multiple "tough discussions" between Apple's teams. The corner cases ultimately forced the feature to be delayed a year to allow time for fixes.

An example of bokeh in Portrait mode on the iPhone XR

Engineering teams invited design and marketing leaders to agree on quality standards for the feature, which also raised the question of "What makes a beautiful portrait?" To set those standards, great works of portrait photography were analyzed for recurring themes, such as how the edge of the face softens into blur while the eyes remain sharp, and those themes were then built into the algorithm.
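Rules like "the edge of the face softens while the eyes stay sharp" can be imagined as adjustments to the mask that decides how strongly each pixel is blurred. The sketch below is a hypothetical illustration of that idea: the subject and eye masks are assumed inputs (in practice they would come from segmentation and landmark models), and nothing here reflects Apple's actual implementation.

```python
# Hypothetical sketch: fold aesthetic rules into the blur mask by feathering
# the subject's edge while forcing eye regions to stay fully sharp.
# The input masks are assumed to exist; this is not Apple's implementation.
import cv2
import numpy as np

image = cv2.imread("portrait.png")
subject_mask = cv2.imread("subject_mask.png", cv2.IMREAD_GRAYSCALE)  # 255 where the subject is
eye_mask = cv2.imread("eye_mask.png", cv2.IMREAD_GRAYSCALE)          # 255 where the eyes are

# Feather the subject edge so focus rolls off gradually at the face boundary.
soft = cv2.GaussianBlur(subject_mask.astype(np.float32) / 255.0, (31, 31), 0)

# Keep the eyes fully sharp regardless of the feathering.
soft = np.maximum(soft, eye_mask.astype(np.float32) / 255.0)

blurred = cv2.GaussianBlur(image, (41, 41), 0)
alpha = soft[..., None]                       # per-pixel weight: 1 = sharp, 0 = blurred
composite = (alpha * image + (1 - alpha) * blurred).astype(np.uint8)
cv2.imwrite("portrait_out.png", composite)
```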

Another problem arose with previewing a portrait photo, as the effect was originally designed to be applied to an image after it had been taken. The Human Interface design team insisted there be some form of "live preview" before the shot, to help the user make adjustments.

Working with the video engineering team responsible for sensor control and camera operations, the camera hardware team was able to push forward and create the addition. It was introduced to the public with the iPhone 7 Plus, and formed a central pillar of Apple's marketing efforts.
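The engineering constraint behind a live preview is that the whole effect has to be recomputed for every frame, fast enough to feel interactive, which usually means working on a reduced-resolution frame. The loop below is a rough stand-in using a webcam and a fixed oval as a fake subject mask; on the phone the mask would come from the camera system itself, and none of this is Apple's code.

```python
# Rough illustration of the live-preview constraint: the effect is recomputed
# for every frame at reduced resolution. The "subject" mask is faked with a
# fixed oval; this is a sketch of the idea, not Apple's implementation.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break

    small = cv2.resize(frame, (640, 360))          # preview works on a small frame
    mask = np.zeros(small.shape[:2], np.float32)
    cv2.ellipse(mask, (320, 200), (140, 180), 0, 0, 360, 1.0, -1)  # stand-in subject
    mask = cv2.GaussianBlur(mask, (31, 31), 0)[..., None]

    blurred = cv2.GaussianBlur(small, (25, 25), 0)
    preview = (mask * small + (1 - mask) * blurred).astype(np.uint8)

    cv2.imshow("portrait preview", preview)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```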

The modern version of Portrait mode in the Camera app can, in some cases, rely heavily on machine learning. On single-camera iPhones like the iPhone XR and the iPhone SE, computational photography advancements have allowed the feature to keep working despite the reduced number of source images and the lack of a second lens to estimate depth.
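With a single camera there is no second viewpoint to compare against, so depth has to be inferred from the image alone by a learned model. The sketch below uses MiDaS, a publicly available monocular depth network, purely as a stand-in for that idea; Apple's on-device models and pipeline are its own and are not published.

```python
# Sketch of single-camera depth: a learned model infers relative depth from
# one image, and the result is used like a disparity map to drive the blur.
# MiDaS is a public stand-in here, not Apple's model.
import cv2
import numpy as np
import torch

model = torch.hub.load("intel-isl/MiDaS", "MiDaS_small")
model.eval()
transforms = torch.hub.load("intel-isl/MiDaS", "transforms").small_transform

image = cv2.cvtColor(cv2.imread("portrait.png"), cv2.COLOR_BGR2RGB)
with torch.no_grad():
    depth = model(transforms(image)).squeeze().numpy()   # relative inverse depth

# Resize to the original frame and normalize; larger values mean nearer.
depth = cv2.resize(depth, (image.shape[1], image.shape[0]))
depth = (depth - depth.min()) / (depth.max() - depth.min())
subject = depth.max()                        # nearest region assumed to be the subject
weight = np.clip((subject - depth) * 3, 0, 1)[..., None]   # blur weight grows with distance

bgr = cv2.cvtColor(image, cv2.COLOR_RGB2BGR)
blurred = cv2.GaussianBlur(bgr, (41, 41), 0)
cv2.imwrite("single_camera_portrait.png",
            ((1 - weight) * bgr + weight * blurred).astype(np.uint8))
```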



9 Comments

randominternetperson 8 Years · 3101 comments


An example of bokeh in Portrait mode on the iPhone XR

That brown shape in the top center of that image is exactly what can go wrong.  The picture is great until you get distracted by that not-blurred-enough thing that looks like a borg implant on the far side of the subject's head.  An SLR with a large aperture and shallow depth of field would have handled that better.  I expect the development team would see this as an unfortunate edge case where their AI failed.

mac_dog 16 Years · 1084 comments


randominternetperson said:
An example of bokeh in Portrait mode on the iPhone XR
That brown shape in the top center of that image is exactly what can go wrong.

The camera works fine in the image. It’s just a crap composition. 

SpamSandwich 19 Years · 32917 comments


randominternetperson said:
An example of bokeh in Portrait mode on the iPhone XR
That brown shape in the top center of that image is exactly what can go wrong.  The picture is great until you get distracted by that not-blurred-enough thing that looks like a borg implant on the far side of the subject's head.  An SLR with a large aperture and shallow depth of field would have handled that better.  I expect the development team would see this as an unfortunate edge case where their AI failed.

Once all iPhones and iPads have LIDAR standard, it will be very useful for assisting the AI to determine what is actually foreground and background.

scartart 17 Years · 201 comments

mac_dog said:

An example of bokeh in Portrait mode on the iPhone XR
That brown shape in the top center of that image is exactly what can go wrong.
The camera works fine in the image. It’s just a crap composition. 

Whilst it could be improved, you don't always get the opportunity in street photography to get a perfect composition. If you had a camera capable of providing a shallow depth of field and good bokeh it wouldn't be an issue. Clearly the AI has failed to achieve what is possible with better quality equipment.

SpamSandwich 19 Years · 32917 comments

scartart said:
mac_dog said:

An example of bokeh in Portrait mode on the iPhone XR
That brown shape in the top center of that image is exactly what can go wrong. 
The camera works fine in the image. It’s just a crap composition. 
Whilst it could be improved, you don't always get the opportunity in street photography to get a perfect composition. If you had a camera capable of providing a shallow depth of field and good bokeh it wouldn't be an issue. Clearly the AI has failed to achieve what is possible with better quality equipment.

Either a stereo view or LIDAR will solve the depth problem for the AI.