Suppliers of iPhone displays are reportedly hoping to persuade Apple to adopt micro lens array technology for the iPhone 16, but while it has key benefits, there are significant downsides.
Apple is very familiar with micro lens array (MLA) technology, having included it in iPhone patents as far back as 2013 and as recently as 2020. But it hasn't used MLA for the main iPhone display, and now The Elec says that screen manufacturers have proposed that Apple do exactly that for the 2024 iPhone 16.
Citing unnamed industry sources, the publication says that both Samsung and LG Display have proposed adding MLA to the iPhone's OLED screen. MLA works by taking light that is normally reflected inside the panel, and redirecting it toward the front of the display.
The clear benefit is that this can increase brightness on an OLED screen. Equally, it means the panel needs less power to maintain a regular brightness level.
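As a rough illustration of that second point, here is a toy calculation; the 20% efficiency gain is an assumed figure for arithmetic's sake, not a reported specification of any MLA panel.

```python
# Toy arithmetic only: the 20% on-axis efficiency gain is an assumed
# figure, not a reported number for any MLA panel. Assumes drive power
# scales linearly with the light the panel must emit.

def relative_power(efficiency_gain: float) -> float:
    """Fraction of baseline power needed for the same perceived
    brightness, if on-axis efficiency rises by `efficiency_gain`."""
    return 1.0 / (1.0 + efficiency_gain)

print(f"Power at same brightness: {relative_power(0.20):.0%} of baseline")
# -> 83% of baseline, i.e. roughly a 17% saving
```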
However, Apple has reportedly yet to decide for or against the proposal, because there are two issues. One is that although MLA increases brightness, it reduces the viewing angle of the screen.
According to The Elec, the viewing angle is the key issue that Apple is looking to suppliers to overcome, but using MLA also reportedly adds to the cost of the display.
While the iPhone 16 is expected in September 2024, Apple has said that it will announce the iPhone 15 range on September 12, 2023.
5 Comments
Interesting choice.
Saving battery is good, but at the expense of viewing angle.
A narrower angle would improve privacy in a crowded environment, but make the phone harder to use in nonstandard situations where you can't hold it in the ideal position.
Too bad it's not switchable.
MLA is used on LG's OLED G3 television for 2023, and I have not heard of any issue with side viewing on those panels.
And why would that be a problem?
If you ask me, the viewing angle should be no more than a few degrees, because it's me who needs to see the display, not the person standing next to or in front of me...
Took me a few reads of that headline…
There is only so much light available in total from the screen, since the display elements still produce the same amount of light. The only thing a lens does is redirect that widely spread light into a narrower, forward-directed cone. Hence, the "stray" light that used to escape from the display at wide angles is now collected and pointed into the user's eyes.
So there isn't really any "problem" with these displays to solve at wide angles; the lenses do exactly what you put them there to do. There may be a problem from a use-case perspective, but not from a technical one. The only way I see to solve the use-case problem is by removing the lenses, completely or partially.
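A minimal sketch of that flux argument, assuming a Lambertian baseline emitter and perfect, lossless redirection into a uniform cone; both are idealizations, not measured MLA behavior:

```python
# Idealized flux-conservation model, not real MLA optics: assumes a
# Lambertian OLED emitter (peak intensity I0, total flux pi*I0) whose
# entire output is losslessly redirected into a uniform cone.
import math

def on_axis_gain(cone_half_angle_deg: float) -> float:
    """On-axis intensity of the redirected cone, relative to the
    original Lambertian peak I0.

    Cone solid angle = 2*pi*(1 - cos(theta_c));
    gain = (pi*I0 / solid_angle) / I0 = 1 / (2*(1 - cos(theta_c))).
    """
    theta_c = math.radians(cone_half_angle_deg)
    return 1.0 / (2.0 * (1.0 - math.cos(theta_c)))

for half_angle in (30, 45, 60, 90):
    print(f"half-angle {half_angle:>2} deg -> gain {on_axis_gain(half_angle):.2f}x")
# 30 deg -> 3.73x, 45 deg -> 1.71x, 60 deg -> 1.00x, 90 deg -> 0.50x
```

The numbers fall straight out of the conservation argument: narrowing the cone buys on-axis brightness exactly at the cost of light at wide angles.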
One such solution is to make the lenses imperfect, in the sense that they leak light to the sides. This could be achieved by tuning the reflectance of the lens materials, or by micro- or nanostructuring the lens. If somebody comes up with a "switchable" function for lenses such as MLAs, that would also work (as suggested above). But in any of these use-case solutions, the perceived brightness in the forward viewing direction would of course decrease.
Lastly, there is an entirely different use-case solution: tracking the user's eyes and dynamically redirecting the focal point of the lenses toward the viewer. This would be the winning design, one that would change the world of displays. Eye tracking is already in place, but MLAs that can be dynamically pointed in different directions are not (especially not ones that are fast and energy-efficient enough). In my view, this is where I would direct all my research if I were heading this project.
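Purely as a sketch of the geometry behind that idea: everything here is hypothetical, since no steerable-MLA hardware or API is known to exist.

```python
# Hypothetical sketch of eye-tracking-driven lens steering. The Gaze
# type and the notion of a per-lens tilt command are invented for
# illustration; no such steerable-MLA hardware exists today.
import math
from dataclasses import dataclass

@dataclass
class Gaze:
    x_mm: float  # eye position relative to screen center, in mm
    y_mm: float
    z_mm: float  # eye distance from the screen plane, in mm

def steering_angles(gaze: Gaze, lens_x_mm: float, lens_y_mm: float) -> tuple[float, float]:
    """Horizontal/vertical tilt (degrees) that would aim one lens's
    emission cone at the tracked eye."""
    return (math.degrees(math.atan2(gaze.x_mm - lens_x_mm, gaze.z_mm)),
            math.degrees(math.atan2(gaze.y_mm - lens_y_mm, gaze.z_mm)))

# Made-up example: viewer 300 mm away, 50 mm left of a center lens.
print(steering_angles(Gaze(-50.0, 0.0, 300.0), lens_x_mm=0.0, lens_y_mm=0.0))
# -> approximately (-9.5, 0.0) degrees
```

In practice every lens would be re-aimed each frame from the tracker's latest gaze estimate, which is exactly where the speed and energy demands mentioned above come in.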
But who am I to know? I only designed a nanofilm you can put on an ordinary 2D display and turn it into 3D …some 20 years ago …and that didn’t take off.