iPhone X tech was originally meant to show up in 2018, Apple's Riccio says
Apple was originally expecting to ship the technologies in the iPhone X in 2018, hardware engineering head Dan Riccio revealed in an interview published this week.
The product's upcoming Nov. 3 launch was managed "with a lot of hard work, talent, grit, and determination," Riccio explained to Mashable. The executive admitted, however, that the push to accelerate changes like an edge-to-edge screen left little time for alternatives if something didn't work out.
The company went "all in" when it decided to swap a home button and Touch ID for Face ID, Riccio noted, dismissing rumors that the iPhone X's exclusive use of Face ID was because it couldn't make an embedded touch sensor work.
"We spent no time looking at [putting] fingerprints on the back or through the glass or on the side," he remarked, adding that while schedules prevented that anyway, Apple executives also believed in the quality of Face ID.
"Quite frankly, this program was on such a fast track to be offered [and] enabled this year. We had to lock [the design] very, very early. We actually locked the design, to let you know, in November. We had to lock it early."
The decision to implement a neural engine in the phone's A11 Bionic processor reportedly traces back to 2014, when the earliest work on the chip began. The company didn't know what it would be used for, Riccio said, but realized the decision had to be made ahead of time.
Apple marketing chief Phil Schiller suggested that the idea of an edge-to-edge display dates back to the first-generation iPhone.
"We've had a dream since Day One to make it all screen, edge to edge," he claimed.
The OLED screen on the iPhone X is being produced by Samsung, Apple confirmed, but it is a custom component, with additional software work done to address issues like color accuracy. The company also noted that the phone's attention detection system simply checks whether anyone is looking, rather than continuously scanning for the registered Face ID user. Attention scans take place roughly every 30 seconds, and are used to decide whether or not to keep the screen on.