Apple secretly acquired a camera startup and has already used its AR technology
Sometime between 2018 and 2019, Apple quietly acquired an Israeli computer vision company — and has already adapted its developments into current products.
In a wide-ranging interview, Apple's head of Artificial Intelligence John Giannandrea states that it's "technically wrong" for AI to be processed at data centers instead of on a user's own device.
Apple is updating its Vision framework in iOS 14 and macOS Big Sur to allow developers to detect body and hand poses or gestures within their apps.
Adding to a mounting number of artificial intelligence-related acquisitions, Apple in the past few weeks purchased Ontario-based Inductiv to work on Siri and machine learning initiatives.
Apple's new iPhone SE is the company's first — and thus far, only — iPhone to solely rely on machine learning for Portrait Mode depth estimation.
Pixelmator Photo for iPad has been updated with support for trackpads and other input devices in iOS 13.4, plus new machine learning-powered features that you can use to match colors between two photos.
Speech recognition systems from major tech companies have a harder time understanding words spoken by Black people than the same words spoken by white people, a new study finds.
Senior Artificial Intelligence executives from technology firms, including Apple, are in Brussels to make their case as the European Union aims to set regulations on artificial intelligence that could drastically affect machine learning globally.
Apple Maps may be able to provide users with more accurate location information in the future by using artificial intelligence to adjust GPS data when errors are detected in sensor readings.
Apple has announced that it will again be attending and sponsoring NeurIPS, the machine learning conference, from December 8 to December 14.
Apple is including a computational photography feature called 'Deep Fusion' in the iOS 13.2 beta, which can help produce highly detailed images from the iPhone 11 and iPhone 11 Pro cameras. AppleInsider explains how the groundbreaking feature functions.
Apple has offered details of an internal development tool titled "Overton," a system for monitoring and improving machine learning applications such as how Siri determines results for queries, by handling the lower-level tasks and allowing engineers to focus more on higher-level concepts.
Apple may be in the early phases of designing small over-the-ear headphones that can detect which ear each side sits over, in order to tailor the audio environment.
Apple's subtly flattering new FaceTime feature in iOS 13 beta 3 corrects the appearance of your attention so that you appear focused on your caller — as if perfectly staring at the camera — even when you're looking at the screen. The magic behind it has incrementally developed across years of evolving software and hardware advancements, offering some interesting insight into how Apple uniquely charts out the future with its products.
In perhaps the cutest application of machine learning yet, Apple has included algorithms in its latest computer vision framework to enable iPhone and iPad apps to detect your furry friends in images.
Apple recently poached Ian Goodfellow, a noted artificial intelligence expert, away from Google as part of efforts to build out a quickly growing team focused on the development of AI and machine learning technologies.
Apple co-founder Steve Wozniak is scheduled to speak at Purdue University on Apr. 17, with the theme "What IF we lose control of technology?"
Apple on Wednesday confirmed its recent takeover of Laserlike, a Silicon Valley startup that applied machine learning to content discovery.
Apple has announced that John Giannandrea has been named to the company's executive team as senior vice president of Machine Learning and Artificial Intelligence Strategy.
Apple on Monday updated its Machine Learning Journal with a post by the company's Siri speech and audio software engineering teams, explaining how the company uses machine learning to help the HomePod hear people under tougher circumstances than iPhones and iPads.