Apple's director of AI research, Ruslan Salakhutdinov, gave peers a small glimpse into the company's self-driving platform this week, discussing some internal projects at the NIPS machine learning conference.
While one of the projects — LiDAR object detection — was detailed in a November research paper, Salakhutdinov also went into previously unpublicized areas, according to Wired. The company's camera-based recognition system, for instance, can discern objects even when lenses are obscured by rain, and identify pedestrians on the side of the road even when they're partially hidden by parked vehicles.
"If you asked me five years ago, I would be very skeptical of saying, 'Yes you could do that,'" Salakhutdinov commented.
The director also talked up Apple's work on dynamic decision-making by cars, such as how to avoid a pedestrian, and its use of "SLAM," or simultaneous localization and mapping, a technique autonomous machines use to build a map of their surroundings while tracking their own position within it.
Apple was further said to be creating 3D maps of cities, including details like traffic lights and road markings. Some of this data is presumably being collected by Apple's autonomous test vehicles, but more could be coming from the Apple Maps vehicles touring cities around the world, a project the company has yet to fully explain.
The ultimate goal of Apple's efforts is unknown, but may involve a platform for ride-hailing services. Before then, the company is expected to begin running its internal "PAIL" (Palo Alto to Infinite Loop) employee shuttle.