While more than one rumor predicted that Apple would add a front-facing flash to its newest iPhone -- an increasingly common feature among high-end handsets -- nobody anticipated that the company would do so in a way that obviated the need for extra hardware, despite evidence that Apple has been exploring such a solution for years.
Retina Flash is how Apple has chosen to refer to a new feature that, put simply, turns the display on an iPhone 6s or 6s Plus into a giant flash. If you're a Mac owner, you've probably seen the same kind of effect before when taking a webcam snap in Photo Booth.
Apple says that the iPhone's Retina Flash has several key differences from the Photo Booth incarnation, though.
To begin with, the iPhone's display goes to 11 when it's in Retina Flash mode. During Wednesday's keynote, Apple touted a custom display driver that can push the screen to three times its normal maximum brightness.
Like the rear-facing True Tone flash, the Retina Flash also analyzes the ambient lighting situation -- utilizing a preflash, in this case -- before choosing a customized flash tone. Apple actually uses the True Tone name in reference to this effect as well, somewhat confusingly.
While Apple didn't reveal anything else about the internal secrets of the Retina Flash on Wednesday, a patent awarded to the company in 2012 sheds some more...light...on the subject.
First filed in 2010, the patent -- with the imaginative title "Image Capture Using Display Device As Light Source" -- describes several methods of analyzing the scene, configuring the flash, and finally capturing an image.
Apple contemplates using both a hardware light sensor and analysis of pixel intensity from the camera feed to determine lighting conditions, then dynamically adjusting the duration, brightness, and color of the display to simulate multiple types of flash. It could be used as a photo flash to compensate for darkness, for instance, or as a fill flash to simply lighten shadows.
Presaging the True Tone approach, Apple also discusses ways of using the flash for color correction:
For example, the screen can be set to various shades of white or to another color (e.g., pink, yellow, etc.) which can affect skin tones. In some implementations, the color balance of the image can be determined, for example, by computing a histogram that represents the color distribution of the image to be captured. The color balance can be corrected by changing the color of the screen to a complementary color prior to flash. For example, if the scene is too blue, the screen color can be changed to a yellow or pink tint to compensate for the blue depending on the color balance the user is trying to capture.
Another interesting method described in the patent is a way to essentially protect the camera's automatic exposure from itself. The company worries that some camera modules may adjust their sensitivity in response to the flash, making the resulting photos too dark.
To combat this, Apple divides the flash time into three intervals -- known as rise time, sustain time, and fall time -- which describe the on and off cycle of the flash. Rise time is the time it takes the flash to reach peak intensity; sustain time is the period during which peak brightness is sustained; and fall time is the time it takes for the flash to turn itself off and ambient light levels to return to normal.
Image capture is then precisely timed to fall within the sustain time, but early enough that the camera does not have the chance to adjust to the flash. Apple also takes into account the camera's frame latency -- the time it takes an image to enter the camera lens, be processed, and then shown on the display -- to ensure the frame captured is the frame that's on screen at the time the user presses the shutter button.
All in all, what appears on the surface to be a simple feature -- flash the screen to make the room brighter -- seems to be another example of Apple's extraordinary attention to detail and drive for perfection.