
Apple engineers reveal the philosophy behind iPhone camera design

Rather than concentrating on any one hardware aspect of iPhone photography, Apple's engineers and managers aim to control every step of taking a photo.

With the launch of the iPhone 12 Pro Max, Apple has introduced the largest camera sensor it has ever put in an iPhone. Yet rather than being there to "brag about," Apple says that it is part of a philosophy that sees camera designers working across every possible aspect from hardware to software.

Speaking to photography site PetaPixel, Francesca Sweet, product line manager for the iPhone, and Jon McCormack, vice president of camera software engineering, emphasized that they work across the whole design in order to simplify taking photos.

"As photographers, we tend to have to think a lot about things like ISO, subject motion, et cetera," Job McCormack said. "And Apple wants to take that away to allow people to stay in the moment, take a great photo, and get back to what they're doing."

"It's not as meaningful to us anymore to talk about one particular speed and feed of an image, or camera system," he continued. "We think about what the goal is, and the goal is not to have a bigger sensor that we can brag about."

"The goal is to ask how we can take more beautiful photos in more conditions that people are in," he said. "It was this thinking that brought about Deep Fusion, Night Mode, and temporal image signal processing."

Apple's overall aim, both McCormack and Sweet say, is to automatically "replicate as much as we can... what the photographer will [typically] do in post." So with Machine Learning, Apple's camera system breaks down an image into elements that it can then process.

"The background, foreground, eyes, lips, hair, skin, clothing, skies," lists McCormack. "We process all these independently like you would in [Adobe] Lightroom with a bunch of local adjustments. We adjust everything from exposure, contrast, and saturation, and combine them all together."

This isn't to deny the advantages of a bigger sensor, according to Sweet. "The new wide camera [of the iPhone 12 Pro Max] and improved image fusion algorithms make for lower noise and better detail."

"With the Pro Max we can extend that even further because the bigger sensor allows us to capture more light in less time, which makes for better motion freezing at night," she continued.

Apple's iPhone 12 range brings camera improvements across the board

Nonetheless, both Sweet and McCormack believe that it is vital how Apple designs and controls every element from lens to software.

"We don't tend to think of a single axis like 'if we go and do this kind of thing to hardware' then a magical thing will happen," said McCormack. "Since we design everything from the lens to the GPU and CPU, we actually get to have many more places that we can do innovation."

The iPhone 12 Pro Max is now available for pre-order. It ships from November 13.



53 Comments

melgross 20 Years · 33622 comments

That’s great, but I believe that most iPhone photographers, as opposed to iPhone snapshot shooters, would prefer the same-sized sensors for each camera, as well as equal quality lenses. The wide already had higher quality because of a larger sensor and a better lens, and now that disparity will get larger. I know that some will say that we take most of our pictures with that lens, and that’s why. Be that as it may, we don’t expect the wide angle and tele lenses for our other cameras to be of lesser quality just because we may not use them as much.

Maybe someday Apple will have a 2.5x to 3x zoom from the wide to the tele, or a 2x from the super wide to the wide. If so, they could eliminate one camera, freeing up room and expense to give both remaining cameras the same large sensor, with IBIS for both.

tmay 11 Years · 6456 comments

melgross said:
That’s great, but I believe that most iPhone photographers, as opposed to iPhone snapshot shooters, would prefer the same-sized sensors for each camera, as well as equal quality lenses. The wide already had higher quality because of a larger sensor and a better lens, and now that disparity will get larger. I know that some will say that we take most of our pictures with that lens, and that’s why. Be that as it may, we don’t expect the wide angle and tele lenses for our other cameras to be of lesser quality just because we may not use them as much.

Maybe someday Apple will have a 2.5x to 3x zoom from the wide to the tele, or a 2x from the super wide to the wide. If so, they could eliminate one camera, freeing up room and expense to give both remaining cameras the same large sensor, with IBIS for both.

Apple is rumored to be increasing some sensor sizes again next year, and perhaps adding sensor-shift stabilization to additional cameras. I have no doubt that the balance of optical zoom, sensor size, and aperture will evolve, but telephoto lenses don't magically work with larger sensors without a commensurate increase in the diameter and height of the optical stack. It would appear to me that Apple has a roadmap for adding a folded-optics telephoto in the future, as is already available from some Android OS device makers, but likely not soon.

Another likely option is that Apple will increase the resolution of its imagers once the hardware can keep up in real time, and that would provide higher quality zoom as well.

CurtisHight 8 Years · 65 comments

One “aspect” that would improve iPhone photography is using square image sensors. This would: 1) allow for the highest quality square image, 2) allow the flash to sit directly over the lenses for horizontal or vertical rectangles, as well as for a square, and 3) allow capturing horizontal or vertical rectangles without needing to rotate the camera to match. That would expand mounting options (mount vertically but shoot horizontal video) and offer ergonomic and usability options. The user could opt for a default of choosing the format by rotating the camera.
Building on this, the camera could capture all three images: the maximum square plus the horizontal and vertical rectangles, and then allow the photographer to access any or all of them later in software.

To have the flash directly over each lens, the lenses would need to be in a line. This exposes a caveat in the ergonomic and usability options: The flash would only be directly above the lenses in either a vertical or horizontal hold, not both. But that is a small debit against a large deposit of advantages.
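For what it's worth, the geometry behind the square-sensor suggestion is easy to quantify. A quick Swift sketch, using a made-up 8,000-pixel sensor side, shows that the two rectangular crops are the same pixels rotated 90 degrees, so neither orientation is penalized:

```swift
// Crops from a hypothetical N x N square sensor; 8000 is an invented figure.
let side = 8000
let landscape = (w: side, h: side * 3 / 4)   // 4:3 horizontal crop
let portrait  = (w: side * 3 / 4, h: side)   // 3:4 vertical crop

let squareMP = Double(side * side) / 1_000_000
let rectMP   = Double(landscape.w * landscape.h) / 1_000_000
print("Square: \(squareMP) MP; either rectangle: \(rectMP) MP")
print("Landscape \(landscape.w)x\(landscape.h), portrait \(portrait.w)x\(portrait.h)")
// Both rectangles come off the same sensor without rotating the phone,
// and the full square remains available as a third framing.
```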

avon b7 20 Years · 8046 comments

A marketing exercise but completely understandable.

What they are saying is what most vertically integrated manufacturers do (and have been doing for years).

Apple is slowly catching up with photography but the harsh truth of the matter is that phone cameras are great and have been for years now. That won't change any time soon. Where phone cameras have branched out over the last two years is in versatility and Apple missed several opportunities to add that. Strangely so. 

tmay 11 Years · 6456 comments

avon b7 said:
A marketing exercise but completely understandable.

What they are saying is what most vertically integrated manufacturers do (and have been doing for years).

Apple is slowly catching up with photography but the harsh truth of the matter is that phone cameras are great and have been for years now. That won't change any time soon. Where phone cameras have branched out over the last two years is in versatility and Apple missed several opportunities to add that. Strangely so. 

Not strange at all, just a result of Apple's business model.

Apple sells enough of the current generation of iPhones each year that it requires tens of millions to hundreds of millions each of a small set of four different cameras. Ramping suppliers to those volumes is non-trivial, which is why Android OS device makers are more than willing to pay premiums for smaller quantities, in the millions to tens of millions, for their flagships; they need those features to compete with other Android OS device makers, and Apple doesn't.

Apple will be shifting some 170 to 190 million iPhone 12 units this fiscal year, each with the most powerful SoC in the industry, all to enable the most user-friendly computational photography in the market. Apple has only "missed several opportunities", in your opinion, because Apple's customer base doesn't buy features, per se, but rather buys experiences, same as it ever has.

The metrics that prove this are ASP, margins, and revenue, all of which Apple leads; and frankly, it isn't all that bad at shifting units either.