Apple's custom Neural Engine in iPhone XS about 'letting nothing get in your way'
Apple's insistence on custom-designing chips like the Neural Engine in the iPhone XS, XS Max, and XR is about unchaining the company's other designers, according to the head of its chip architecture team.
"It's about owning the pieces that are critical and letting nothing get in your way," VP Tim Millet told Wired in an interview published on Tuesday. "The experiences we deliver through the phone are critically dependent on the chip."
Work on the first-generation Neural Engine, which appeared in the iPhone 8, 8 Plus, and X, reportedly began a few years ago with photography in mind. Engineers at the company thought iPhone cameras could be enhanced by machine learning, and some of the initial results included 2017's Portrait Lighting and Face ID technologies.
"We couldn't have done that [Face ID] properly without the Neural Engine," Millet said.
The second-generation Neural Engine in 2018 iPhones can run 5 trillion operations per second, and helps deliver more photo-related features, such as the ability to adjust depth-of-field after a photo is taken, as well as better augmented reality. Apple is additionally opening up the chip to outside developers.
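Third-party access goes through the Core ML framework rather than the chip directly; apps request compute units and the system decides when work runs on the Neural Engine. A minimal Swift sketch, assuming the iOS 12 `MLModelConfiguration` API (the model filename here is a placeholder, not a real asset):

```swift
import CoreML

// Ask Core ML to use all available compute units -- CPU, GPU,
// and the Neural Engine where the system deems it suitable.
let config = MLModelConfiguration()
config.computeUnits = .all

// "SomeModel.mlmodelc" is a hypothetical compiled Core ML model.
let modelURL = URL(fileURLWithPath: "SomeModel.mlmodelc")
let model = try MLModel(contentsOf: modelURL, configuration: config)
```

Note that developers cannot target the Neural Engine explicitly; `.all` simply permits the OS to schedule eligible layers there.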
Most non-Apple smartphones use off-the-shelf chip designs from companies like Qualcomm. While those can be powerful and are steadily advancing, Apple's in-house design work has allowed it to build tight hardware/software integration and achieve features that would otherwise have to wait.
Apple has been designing custom chips since the A4 processor used in 2010's iPhone 4, following its 2008 acquisition of P.A. Semi. Actual manufacturing was for some time handled by Samsung, but is now thought to be the exclusive domain of TSMC.
The use of custom silicon has spread beyond central processors to parts like the T2 chip, which handles the Touch Bar and SSDs in Macs. Some third-party chips remain, such as cellular modems and Wi-Fi.