Apple's iPhone 6 was already a great mobile camera, a fact the company touted globally in billboards featuring shots actually taken by the phone. With iPhone 6s, a faster, smarter processor and a new 12MP image sensor deliver a sharper, more vibrant photography upgrade.
In testing the new camera features of iPhone 6s and 6s Plus against last year's 6 and 6 Plus, there wasn't always an obvious, dramatic jump in image quality of the kind suggested by the "50 percent increase in pixels" of the new models' 12MP sensor over the previous 8MP shooter.
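That 50 percent figure describes total pixel count, not linear resolution, which is one reason the visible difference is smaller than the headline number implies. A quick back-of-envelope sketch (assuming the standard still-photo dimensions of 3264x2448 for the 8MP sensor and 4032x3024 for the 12MP sensor) makes the distinction concrete:

```python
# Pixel-count arithmetic behind the "50 percent more pixels" claim.
# Dimensions assumed: 8MP stills are 3264x2448, 12MP stills are 4032x3024.
old_pixels = 3264 * 2448   # iPhone 6 / 6 Plus
new_pixels = 4032 * 3024   # iPhone 6s / 6s Plus

increase = (new_pixels / old_pixels - 1) * 100
print(f"{old_pixels:,} -> {new_pixels:,} pixels ({increase:.0f}% more)")

# Linear resolution (how much wider/taller the image is) grows far less
# than total pixel count does:
linear_gain = (4032 / 3264 - 1) * 100
print(f"Linear resolution gain: {linear_gain:.0f}%")
```

The total pixel count grows by roughly half, but the image is only about a quarter wider and taller, which is why side-by-side shots don't look 50 percent sharper.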
That's because sensor pixel density by itself doesn't necessarily produce sharper, more accurate and better looking photos. In fact, as seen on other cameraphones that beat Apple to market with 12MP or even higher-resolution sensors, additional pixel density can make images noisy, thanks to the crosstalk that occurs between sensor pixels as they get smaller and are packed more tightly on the sensor chip.
That explains why last year, DxOMark benchmarks for image quality ranked the 8MP iPhone 6 and 6 Plus higher than competing phones with much higher sensor specs, including the 13MP LG G2, the 16MP Samsung Galaxy S5, the 20.7MP Sony Xperia Z3, and even the 41MP sensors used by the Nokia 808 and Lumia 1020.
Source: DxOMark
For iPhone 6s, Apple is moving to a new 12MP sensor, but not merely to maintain spec-sheet parity with Android and Windows Phone models that boast higher pixel counts without better photos.
The new sensor in iPhone 6s and 6s Plus retains the "Focus Pixels" of iPhone 6 (phase detection pixels on the sensor chip that enable fast autofocus) while adding "deep trench isolation" between sensor pixels to minimize crosstalk and the image noise and static-like pixelation that result from it.
Additionally, Apple's proprietary Image Signal Processor silicon (built into the A9 Application Processor) now has improved noise reduction logic to smooth out noise while retaining detail in edges and textures, detail that is commonly blurred away by the simpler imaging systems used in competing cameraphones.
Colors that pop with increased accuracy
In some of the photo comparisons I took between a 6 Plus and a 6s Plus, the new phone's images looked brighter, which might be interpreted as washed out. However, color reproduction appears more accurate, and more detail appears to be recorded.
These images (below) of Apple's Infinite Loop campus indicate a significant increase in image detail with less noise from iPhone 6s (top images taken with the already decent mobile camera on iPhone 6 Plus, followed by Apple's enhanced iPhone 6s Plus).
The difference in sharpness and noise grain is more evident as you examine detail, even here where the original images have been scaled down and compressed for publication, losing much of their detail.
Increased color and image detail gives iPhone 6s users more latitude in editing their photos afterward, such as boosting saturation or adjusting light and dark details. That's in contrast to having the camera itself automatically "goose" every photo with canned effects, producing phony, oversaturated pictures that are intended to dazzle while throwing out real information in the process.
Apple noted that its goal for the iPhone's camera system is to capture accurate images. That stands in contrast to rival cameraphone makers, who often promote features that, for example, create composite photos that mix together several shots to create a "photo" where everyone is smiling, even though that image is actually fictional.
Better color and image detail is also evident in these shots taken while driving through Grants Pass, Oregon. The iPhone 6s image is on the bottom. In comparison, the top iPhone 6 Plus photo almost looks like it has a blurred layer of fog over it, even here where both images have been scaled down for publication.
Images compressed for publication
The image detail and color accuracy of these flowers is also apparent, and when zoomed in, you can see a broader range of detail and color information, giving you more room for editing afterward in a sharing app like Instagram.
Another example of added detail without pixelated noise or blurring can be seen in the edges of this brick building and its fire escapes and ornamentation, as well as its more accurate color.
Images compressed for publication
The increased detail and color range of photos taken by iPhone 6s can consume more storage space. In the photos above, the Apple Campus image is 1.6MB from the 6 Plus, but 2.1MB from the 6s Plus. The highway photo jumped from 2.1MB to 3MB on the 6s Plus. The flowers image was only slightly larger, growing from 1.7MB to 1.9MB, while the brick building jumped from 2MB to 2.7MB.
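The growth in file size varies noticeably by scene. A quick calculation over the sizes quoted above (a back-of-envelope sketch, using only the figures from this comparison) shows the increase ranges from roughly 12 to 43 percent:

```python
# Percent growth in file size for each sample photo, using the
# (old, new) sizes in MB quoted in the comparison above.
sizes = {
    "Apple Campus":   (1.6, 2.1),
    "Highway":        (2.1, 3.0),
    "Flowers":        (1.7, 1.9),
    "Brick building": (2.0, 2.7),
}

for name, (old_mb, new_mb) in sizes.items():
    growth = (new_mb / old_mb - 1) * 100
    print(f"{name}: {old_mb}MB -> {new_mb}MB (+{growth:.0f}%)")
```

Scenes with fine texture and strong color (the highway, the brick building) gain the most data, which fits the claim that the new sensor is recording more real detail rather than simply padding files.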
However, the significantly improved images make the difference in file size worth it, particularly as iCloud storage and other forms of backup drop in price to the point where a few extra megabytes of data are essentially inconsequential.
One aspect that hasn't changed in the jump from 6 to 6s is the plastic lens assembly. While Apple has put a lot of engineering work into developing a compact, multi-element lens system for iPhones, the small size and plastic construction of these lenses result in some common, annoying artifacts.
This is particularly an issue when taking photos with bright light sources in the subject area, which can result in extraneous blue dots as seen below (a problem we noted last year in our review of iPhone 6 models). When the sun or other strong lighting hits the lens assembly from the side, it can also introduce lens flares or cloud the entire image with light refraction.
Until Apple gives iPhone lenses a major update, the only way to deal with these problems is to change the angle of the camera; in some cases it can also help to use your hand to block the offending light from hitting the lens assembly from the side.
In addition to 12MP photos, iPhone 6s models also enable the capture of 12MP Live Photos, 1080p slo-mo and 4K video, all new motion formats we will explore in detail in a follow-up segment.