
Apple looking at augmented reality to help ride-hailing app customers find their drivers

Uber's live location sharing feature

Apple is continuing to explore ways to make augmented reality more useful, with one idea helping passengers of ride-hailing services like Uber find their designated car and avoid climbing into the wrong vehicle.

The expansion of ride-hailing apps means more people are using the services, and more drivers are on the road ferrying customers between locations. This creates a problem for customers: identifying which vehicle is their ride, an issue that commonly arises when there is a lot of traffic or many parked cars, or when multiple vehicles from the same service converge on a popular pick-up point at the same time.

While the apps for these services typically include a map showing where the user and the driver are in relation to each other using GPS coordinates, a pin or a dot on a map is less useful in crowded environments. Not everyone can read maps effectively, either, which further limits their usefulness.

Drivers face a similar problem, as a crowded pick-up location can make it harder to spot the intended client among other people and to pull up closer to them. Short of having a description of the customer's clothes or an extremely precise location, something GPS cannot always provide, the task is equally tricky for drivers.

In a patent application published by the U.S. Patent and Trademark Office on Thursday, titled "Augmented reality interface for facilitating identification of arriving vehicle," Apple aims to solve the problem using a combination of GPS and visual identifiers.

Image from an Apple patent application showing a passenger finding a driver via AR

Under Apple's proposal, a passenger would hold up their iPhone when they want to identify the vehicle picking them up, panning the device so the rear camera takes in more of the environment, until an on-screen indicator marks the correct vehicle. For drivers, it would work in a similar fashion, with the person the system believes to be the correct passenger highlighted on the display.

Starting from the GPS locations of both app users, which confirm the two parties are in roughly the same area, the customer's app would be provided with details of the vehicle picking them up, such as its make, model, and color. The AR system would use image data from the rear camera to detect vehicles, then check their characteristics against the provided data to determine which one is most likely the correct car.
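
The filing describes this matching step in functional terms rather than as code, but it can be pictured as a simple scoring pass over camera-detected vehicles. The Swift sketch below is purely illustrative, with the RideDetails and DetectedVehicle types assumed rather than taken from the patent:

```swift
import Foundation

// Hypothetical types for illustration; the patent describes the idea, not this code.
struct RideDetails {
    let make: String
    let model: String
    let color: String
}

struct DetectedVehicle {
    let make: String?        // attributes inferred from the camera image;
    let model: String?       // any of them may be unavailable
    let color: String?
    let distanceMeters: Double
}

/// Scores each camera-detected vehicle against the details supplied by the
/// ride-hailing service and returns the most likely match, if any.
func likelyRide(from candidates: [DetectedVehicle],
                matching details: RideDetails) -> DetectedVehicle? {
    func score(_ vehicle: DetectedVehicle) -> Double {
        var total = 0.0
        if vehicle.color?.caseInsensitiveCompare(details.color) == .orderedSame { total += 1 }
        if vehicle.make?.caseInsensitiveCompare(details.make) == .orderedSame { total += 2 }
        if vehicle.model?.caseInsensitiveCompare(details.model) == .orderedSame { total += 2 }
        total -= vehicle.distanceMeters / 100.0   // prefer nearer vehicles on ties
        return total
    }
    return candidates.max(by: { score($0) < score($1) })
}
```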

Once the vehicle has been identified, the app's AR view highlights it whenever the camera can see it, for example with an arrow pointing at the car or a circle drawn around it on the road.
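
The patent leaves the rendering of that highlight open. As an illustration only, and assuming the vehicle's world-space position has already been estimated by an earlier detection step, Apple's existing ARKit and SceneKit frameworks could anchor a simple marker at that position:

```swift
import ARKit
import SceneKit
import UIKit
import simd

/// Adds a ring-shaped marker hovering above a world-space position in an
/// ARSCNView, roughly the kind of overlay shown in the filing's figures.
/// The position is assumed to come from an earlier vehicle-detection step.
func addHighlight(at worldPosition: simd_float3, in sceneView: ARSCNView) {
    let ring = SCNTorus(ringRadius: 0.5, pipeRadius: 0.02)
    ring.firstMaterial?.diffuse.contents = UIColor.green
    ring.firstMaterial?.emission.contents = UIColor.green

    let marker = SCNNode(geometry: ring)
    marker.simdPosition = worldPosition + simd_float3(0, 1.5, 0)  // float 1.5m above the car
    sceneView.scene.rootNode.addChildNode(marker)
}
```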

Image from an Apple patent application showing a ride-hailing app driver finding a passenger via AR

A similar system would be employed for drivers to identify passengers. Since a passenger's clothing changes regularly, the system builds a profile of characteristics that are unlikely to vary, such as height, approximate weight, facial hair, hair color, and a photograph, which the app then uses to narrow a crowd down to just the customer.

As appearances can still change, the backup for this style of identification is for the user to supply the app with other information in advance, like nearby landmarks or other obvious details about their appearance.

When the two parties are within a predetermined range of each other, the app can prompt the driver to raise their device and scan the environment with the rear camera, or, if the device is already mounted in a suitable position, automatically enable a scanning mode. Facial recognition, along with systems for identifying clothing, local landmarks, and other elements from the provided data, is then used to pick out the customer, and the image on the screen is altered to highlight the likely client.
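
The filing again describes the goal rather than an implementation, but the face-detection part of such a scanning mode could plausibly lean on Apple's existing Vision framework. This sketch only finds face rectangles in a single camera frame; comparing candidates against the stored passenger profile would be a separate step the patent does not detail:

```swift
import Vision
import CoreGraphics
import CoreVideo

/// Finds face bounding boxes in a single camera frame using the Vision framework.
/// Matching the detected faces against a stored passenger profile is left out,
/// just as the patent application leaves it unspecified.
func detectFaces(in pixelBuffer: CVPixelBuffer,
                 completion: @escaping ([CGRect]) -> Void) {
    let request = VNDetectFaceRectanglesRequest { request, _ in
        // Bounding boxes are returned in normalized image coordinates (0...1).
        let boxes = (request.results as? [VNFaceObservation])?.map { $0.boundingBox } ?? []
        completion(boxes)
    }
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    try? handler.perform([request])
}
```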

While the patent application mostly deals with ride-hailing services, it also suggests alternative uses, such as helping passengers identify the correct bus among several at a stop or station, or the right train at a platform.

The publication of a patent or patent application isn't a guarantee that the concepts described will actually make it into a future product or service, but does indicate areas of interest for Apple.

Apple has put a considerable amount of effort and resources into augmented reality, including the creation of ARKit to make it simpler for developers to add AR content to their apps. The company has also made a number of acquisitions and hires in the field, including purchasing AR headset lens maker Akonia Holographics and bringing aboard VR art app developer Sterling Crispin as a prototyping researcher.

To demonstrate AR's usefulness, Apple introduced the Measure app in iOS 12, which uses ARKit and the iPhone's rear camera to measure the dimensions of objects. Another feature, AR Quick Look, can quickly place 3D objects from an online store into the user's surroundings, which could help shoppers decide whether or not to make a purchase.
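
AR Quick Look is exposed to developers through the QuickLook framework: presenting a USDZ model for in-place AR viewing needs little more than a QLPreviewController and a data source. A minimal sketch, with the model URL assumed to point at a bundled .usdz file:

```swift
import UIKit
import QuickLook

/// Presents a USDZ model with AR Quick Look, letting the user preview the
/// object in their surroundings through the camera. Hypothetical helper class.
final class ARQuickLookPresenter: NSObject, QLPreviewControllerDataSource {
    private let modelURL: URL   // assumed to point at a local .usdz file

    init(modelURL: URL) {
        self.modelURL = modelURL
        super.init()
    }

    func present(from viewController: UIViewController) {
        let preview = QLPreviewController()
        preview.dataSource = self
        viewController.present(preview, animated: true)
    }

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int {
        return 1
    }

    func previewController(_ controller: QLPreviewController,
                           previewItemAt index: Int) -> QLPreviewItem {
        return modelURL as NSURL   // NSURL conforms to QLPreviewItem
    }
}
```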

Rumors surrounding Apple smart glasses and AR headsets have also continued through the year, including one suggesting a headset could employ WiGig and 8K-resolution eyepieces. As for when an Apple headset or smart glasses could launch, some analysts suggest one could arrive as soon as 2021.