
Apple should adopt these Pixel 4 features for iPhone 12

The Google Pixel 4


There are a lot of exciting features in the Google Pixel 4. Even if the overall package is flawed, the phone still has features we like so much that we hope Apple adopts them for the next-generation iPhone.

Bypass lock screen

As we've used the Pixel 4, there is one feature we didn't fully appreciate until we switched back to our iPhone: bypassing the lock screen once the phone is unlocked.

The iPhone authenticates as quickly as it can, but it doesn't take you to the home screen until you swipe up from the bottom after the unlock. The Pixel 4 takes you straight to the home screen, no swipe necessary, as soon as it is authenticated.

The Pixel 4 does offer an option to disable this, but for people who like it, as we do, it makes the experience faster and more seamless. We'd like the iPhone to do the same, because the swipe gesture becomes tedious and slow when the phone could simply land on the home screen right away.

Matte finish sides

Starting small, both the iPhone 11 and Pixel 4 lines offer models with glossy or matte back options, but only the Pixel 4 comes with matte sides. These sides feel great in the hand and don't show fingerprints as readily as the glossy sides of the iPhone.

The matte finish of the Google Pixel 4

If you go caseless, the matte finish is great to hold and keeps sufficient grip. It would also mesh nicely with the matte glass backs of the iPhone 11 Pro and iPhone 11 Pro Max.

Of course, we've seen this before with the iPhone 7's matte black finish. We'd just like to see it again soon.

Motion Sense

Motion Sense is a new feature for the Pixel 4, based on the 3D-sensing radar built into the front of the phone. Using that low-powered radar, the Pixel 4 can detect when your hand approaches the phone or when you walk away from your device.

It has some gimmicky applications such as waving an arm in the air to skip a song. This may need a "killer app" to really take off beyond system integrations.

But as-is, Motion Sense can do some truly great things. It will turn off the display when you walk away from the phone. It will silence an alarm or a ring tone as you reach for the device.

Motion Sense starting the unlock process before the Pixel 4 is even touched

It also kicks off the face scan as soon as you reach for the phone, giving it a speed advantage from locked to unlocked in most cases. If the iPhone were to do this, Face ID would be far faster than the new Pixel 4.

Apple already does some of this: attention awareness uses your eyes to detect when you're looking at your iPhone, and the TrueDepth camera system can already sense depth.

But an upgrade to radar-based sensing would be a nice addition to the iPhone.
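
For the curious, here's a minimal Swift sketch of what radar-triggered pre-authentication could look like from an app's point of view. LAContext and evaluatePolicy are real LocalAuthentication APIs, but the ReachDetecting source is purely hypothetical, since Apple exposes no radar or reach sensor to developers, and only the system itself could actually dismiss the lock screen.

import LocalAuthentication

// Hypothetical reach detector; Apple exposes no radar sensor API, so this
// protocol stands in for whatever the system would use internally.
protocol ReachDetecting: AnyObject {
    var onReach: (() -> Void)? { get set }
}

final class PreemptiveUnlocker {
    private let detector: ReachDetecting
    private let context = LAContext()

    init(detector: ReachDetecting) {
        self.detector = detector
        // Start the biometric check the moment a reach is detected, instead of
        // waiting for the screen to wake and the user to look at it.
        detector.onReach = { [weak self] in
            self?.beginUnlock()
        }
    }

    private func beginUnlock() {
        var error: NSError?
        guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                                        error: &error) else { return }

        context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                               localizedReason: "Unlock as you pick up the phone") { success, _ in
            if success {
                // Only the OS could actually dismiss the lock screen here; an
                // app can merely unlock its own protected content this way.
            }
        }
    }
}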

Onboard assistant

Starting with the Pixel 4, queries to Google Assistant are largely processed locally on the phone. Google Assistant transcribes your voice to text on-device and, if the request is something that can be handled locally, the response is nearly instant, with no call out to the internet for information.

As an example, if you just want Siri to open the Notes app, you say "Hey Siri, open the Notes app." That audio is sent to the cloud, transcribed, and processed there before the result is carried out on your phone. It happens quickly, but there is still a small delay and a reliance on an internet connection.

Google Assistant can transcribe locally

On the Pixel 4, the request is transcribed right on the phone and processed there, so the Notes app opens instantly without waiting for anything to be sent to the cloud. And most of it works without any internet connection at all.

In use, not having to rely on an internet connection for simple, local tasks such as opening apps, setting alarms, or dictation would be quite a bit faster and a welcome change. This was the original promise of Siri, after all.
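
As a rough illustration of the idea, here is a minimal Swift sketch of what an on-device command path can look like. SFSpeechRecognizer and requiresOnDeviceRecognition are real Speech framework APIs on iOS 13 and later; the tiny intent matcher is our own simplification for this example, and none of this reflects how Siri itself is implemented.

import Speech

// A minimal on-device command handler. Assumes the user has already granted
// speech-recognition permission via SFSpeechRecognizer.requestAuthorization.
final class LocalCommandHandler {
    private let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en_US"))

    func handle(audioURL: URL) {
        // Only proceed if this device can transcribe without the network.
        guard let recognizer = recognizer, recognizer.supportsOnDeviceRecognition else { return }

        let request = SFSpeechURLRecognitionRequest(url: audioURL)
        request.requiresOnDeviceRecognition = true   // audio never leaves the phone

        recognizer.recognitionTask(with: request) { [weak self] result, _ in
            guard let result = result, result.isFinal else { return }
            self?.route(result.bestTranscription.formattedString.lowercased())
        }
    }

    private func route(_ text: String) {
        // Trivial local "intent" matching -- no cloud round trip needed.
        if text.contains("open") && text.contains("notes") {
            print("Opening Notes locally")
        } else if text.contains("set a timer") {
            print("Starting a local timer")
        }
    }
}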

Higher-resolution telephoto camera

The iPhone sports an additional ultra-wide camera over the Pixel 4, but Google upped the resolution of its new telephoto camera to 16MP, above the 12MP sensor in the iPhone 11 Pro and iPhone 11 Pro Max.

8X telephoto shots on the iPhone 11 Pro and Google Pixel 4

This makes a profound difference in telephoto shots, especially when you zoom in beyond the default 2X. Jumping up to 8X zoom, the Pixel's results look better than the iPhone's, hands down.
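
A quick back-of-the-envelope calculation shows why the extra megapixels matter at high zoom. Treating digital zoom beyond the 2X optical telephoto as a straight center crop, and ignoring multi-frame tricks like Super Res Zoom or Deep Fusion, the native pixel data left at 8X looks roughly like this:

// Pixel count after a digital (crop) zoom falls with the square of the extra
// zoom factor. Sensor figures are the published 16MP (Pixel 4 telephoto) and
// 12MP (iPhone 11 Pro telephoto); this is an estimate, not a quality metric.
func croppedMegapixels(sensorMP: Double, opticalZoom: Double, targetZoom: Double) -> Double {
    let cropFactor = targetZoom / opticalZoom       // e.g. 8X / 2X = 4
    return sensorMP / (cropFactor * cropFactor)     // sensor area shrinks by cropFactor^2
}

print(croppedMegapixels(sensorMP: 16, opticalZoom: 2, targetZoom: 8))   // 1.0 MP of native data
print(croppedMegapixels(sensorMP: 12, opticalZoom: 2, targetZoom: 8))   // 0.75 MP of native data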

On the "iPhone 12 Pro" or whatever it ends up being called, we want Apple to up the resolution and combine it with its computational photography chops to allow the creation of outstanding telephoto images.

90Hz refresh rate

Apple's OLED and LCD screens in the iPhone 11 Pro and iPhone 11 lines have a 60Hz refresh rate. That is fine, but it lags behind the 90Hz and 120Hz displays appearing on other high-end phones.

90Hz refresh rate makes the phone feel more natural

The Pixel 4 has a 90Hz display that speeds up during fast motion, then intelligently drops down to 60Hz for normal operation to save battery. We're not sure it's working right just yet, though, as battery life on the Pixel 4 is horrid.

There's a lot to consider when increasing the refresh rate. A display that refreshes more often draws more power, but it isn't the only component that will. The system comes under greater graphical load, and that drains power as well.
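
For a sense of how an app could cooperate with such a display, here is a minimal Swift sketch built on CADisplayLink's preferredFramesPerSecond, which is a real Core Animation API. The 90 is purely illustrative; today's iPhones top out at 60Hz, and the system treats the value as a request rather than a guarantee.

import UIKit

// Requests a higher frame rate only while it is actually needed, then drops
// back down so the display is not burning power at 90Hz all the time.
final class AdaptiveFrameRateDriver {
    private var link: CADisplayLink?

    func start(target: Any, selector: Selector) {
        let link = CADisplayLink(target: target, selector: selector)
        link.preferredFramesPerSecond = 60          // idle / normal operation
        link.add(to: .main, forMode: .common)
        self.link = link
    }

    // Call while the user is scrolling or an animation is running.
    func enterFastMotion() {
        link?.preferredFramesPerSecond = 90
    }

    // Drop back down once motion settles to save battery.
    func exitFastMotion() {
        link?.preferredFramesPerSecond = 60
    }
}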

If the power situation can be solved, Apple could use the technology in the iPhone for smoother animations, better scrolling, and a more immersive iOS experience. If recent rumors are to be believed, this one may already be in the works.

Innovation?

Innovation isn't, and never has been, about being first to market with something. You can ask Samsung how it feels about being first with the Galaxy Gear watch or the Galaxy Fold; it hasn't worked out that well for either product.

The iPhone wasn't first. It was innovative because it worked. It was innovative because it integrated all of these features into a coherent package that was easy for the user to take advantage of.

We like the potential of the Pixel 4. It has a lot of features that could be fantastic. But right now, they're poorly implemented, and the disparate integrations aren't doing the flagship Android phone, made by the primary Android developer, any favors. This should be the ultimate expression of Android, and yet it isn't, because the features on their own are half-baked and not ready for prime time. If it were a song, it would be one performed by ten very talented students with no conductor and no sheet music.

In an iPhone, with Apple's integration, these features would be a chorus.



29 Comments

zoetmb 17 Years · 2655 comments

Lenses don't really have resolution in the common sense of the term, sensors do. Lenses have resolution only in the sense of their ability to resolve a line. And increased resolution in a very small sensor can cause inferior quality, because the photosites are smaller and when they heat up, they create noise, especially in low-light conditions (high ISOs).

If you look at DSLRs, which have full frame sensors 20 times the size of a phone sensor, the highest-end cameras have lower resolution than the less expensive cameras right below them for this very reason. They do that because it results in higher overall picture quality, especially in low light. For example, in the case of Nikon, the highest end D5 has a 20.8MP sensor and the upcoming D6 is expected to have a 24MP sensor. The D5 body was originally $6500 and is now selling for $5500. The lower-priced D850 (originally $3300, now $2800) has a 45.7MP sensor. It's similar for Canon. Sony's highest end mirrorless camera, the A9ii, has a 24.2MP sensor. The lower-priced a7Riv has a 61MP sensor.

tjwolf 12 Years · 423 comments

I completely agree that Siri should not require internet for basic things - how many times have I been hiking in the woods and tried to increase/decrease volume for my AirPods or change songs in my local library only to have Siri tell me it can't do that because there's no internet connection :-(

But using "transcribing" is kind of an odd choice for your argument - since that doesn't even require Siri.  The on-screen keyboard lets you dictate using voice and it's automatically transcribed into whatever app you're running.

sflocal 16 Years · 6138 comments

A 120Hz screen refresh rate would be really cool. However, if I have to choose between great battery life at 60Hz or horrible battery life at 120Hz, it's a no-brainer.

When it comes to Apple, I have faith it will work far better than Google's implementation.  

mike1 10 Years · 3437 comments

I've always wanted some of the basic Siri commands to be managed on the device rather than going out to the cloud. Examples include...
Opening any installed app
Calling a contact. I get that dictating a message might not be practical to do on the device.
Finding photos already labeled with names
Telling me what's on my calendar for the day or when my next meeting is.
Raise and lower volume
Set a timer