

Apple AI

A significant fad has taken over Silicon Valley: Artificial Intelligence. Apple has mostly avoided the loaded term, but machine learning can be found all over its products. Many think Apple is behind in machine intelligence, but we're here to tell you Apple AI is already here.

Quick Links
Siri
Apple Car
Apple Silicon
Privacy
John Giannandrea

Apple had been avoiding the term "AI," or Artificial Intelligence, until recently, but executives are now using it in more interviews and press releases. Even the M3 MacBook Air announcement had a section detailing how the laptop powers AI applications, and the 13-inch iPad Pro release was filled with AI references.

Apple has been building machine learning into its devices for years; an early example is the iPhone's touchscreen keyboard. It used algorithms to determine which letter a person meant to hit when typing, and it learned from each user.

That keyboard feature evolved into an advanced predictive autocorrect system that now uses a transformer language model in iOS 17. Apple executives have even referred to the new autocorrect as AI in interviews when pressed for examples of such technology on its platforms.
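
To get a feel for the basic idea, here is a minimal Python sketch of how a touch keyboard might weigh tap position against language context when deciding which letter the user meant. The key layout, bigram table, and function name are all invented for illustration; Apple's actual implementation is not public.

```python
import math

# Hypothetical key centers on a 2D layout (a tiny subset; coordinates invented)
KEY_POS = {"q": (0, 0), "w": (1, 0), "e": (2, 0), "r": (3, 0)}

# Toy probabilities P(next letter | previous letter); a real keyboard
# learns context like this per user and, in iOS 17, from a transformer model
BIGRAM = {("h", "e"): 0.6, ("h", "w"): 0.05, ("h", "q"): 0.05, ("h", "r"): 0.3}

def likely_key(tap_xy, prev_char, sigma=0.7):
    """Combine touch accuracy (a Gaussian around each key center) with a
    language-model prior to pick the letter the user probably meant."""
    best, best_score = None, -1.0
    for key, (kx, ky) in KEY_POS.items():
        dist2 = (tap_xy[0] - kx) ** 2 + (tap_xy[1] - ky) ** 2
        touch_p = math.exp(-dist2 / (2 * sigma ** 2))  # closer key scores higher
        lang_p = BIGRAM.get((prev_char, key), 0.01)    # prior from context
        score = touch_p * lang_p
        if score > best_score:
            best, best_score = key, score
    return best
```

A tap landing exactly between "w" and "e" after typing "h" resolves to "e", because the language prior breaks the tie; a tap squarely on "q" still wins despite its low prior, because touch evidence dominates.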

The real test of Apple's dedication to AI will be WWDC 2024. As we get closer to the June 10 keynote, signs are mounting of Apple's work in AI, including public research papers covering various technologies. Leaks surrounding iOS 18 and macOS 15 point to AI-filled releases.

A note on terminology and Apple AI

We will refer to Apple's advanced machine learning efforts as "AI" or Artificial Intelligence to keep things simple. Apple didn't use the term in its operating systems, services, or feature descriptions until early 2024, but it is what the general public uses when discussing computer-generated content.

Apple tends to avoid using industry terms to distance its technologies from competitors. For example, AR and VR were not mentioned during the Apple Vision Pro demo. Instead, Apple called it Spatial Computing.

It isn't clear if Apple will continue to lean into AI or come up with its own term, but for now it uses AI when discussing product features relating to M-series Neural Engines. Once Apple has its own branded AI systems, the name could change.

Apple AI is used as a catch-all term in this piece for anything related to machine learning, transformer technology, generative AI, and other systems being developed by the company. The term is not official by any stretch.

Apple's rumored AI strategy

As WWDC gets closer, rumors and public research papers have begun painting a picture of what Apple's AI strategy could be. It could mirror the company's existing search arrangement, in which Google pays to be the default, or take the form of a whole new kind of App Store.

Apple is allegedly in talks to license LLMs like Gemini

Multiple reliable sources suggest Apple is in talks with Google to license its LLM, Gemini, for iPhone. Like Google Search today, it would act as a default, outward-facing connection to a service Apple doesn't provide itself.

At the same time, Apple appears to be working on its own local models, including one called MM1, which could be installed on the iPhone and process data privately. These tools would work similarly to Apple's Spotlight, a local search tool that calls out to Google only when necessary.
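
That Spotlight-style pattern, answering on-device first and escalating to a cloud service only when necessary, can be sketched in a few lines. This is a speculative illustration of the rumored architecture; the function names, confidence score, and threshold are all assumptions, not Apple APIs.

```python
def answer(query, local_model, cloud_model, threshold=0.8):
    """Hypothetical on-device-first routing: ask the private local model,
    and only escalate to a cloud LLM when local confidence is low.

    local_model(query) -> (text, confidence in [0, 1])
    cloud_model(query) -> text
    """
    text, confidence = local_model(query)
    if confidence >= threshold:
        return text, "on-device"       # private path: data never leaves the device
    return cloud_model(query), "cloud"  # fallback: outward-facing service
```

The appeal of this design is that the common case stays private and fast, while a licensed model like Gemini would only see the queries the local model can't handle.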

If Apple chooses not to build a massive LLM, which requires enormous data troves and server farms, it could instead focus on vertical integration and smaller, more private models that benefit users and developers.

An alternate path, one that could also explain Apple's talks with Google, is a new AI App Store. It would be a storefront dedicated to AI apps and experiences that can be installed on the iPhone like any other app.

Siri Intelligence and AppleGPT

Siri is one of Apple's original machine learning efforts. It arrived in 2010, when Apple purchased Siri Inc., a spin-off of the research institute SRI International.

In its most basic form, Siri is a decision tree that translates verbal commands into yes/no variables until it reaches a solution. At least, that's how it generally operated at its inception.
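
A decision tree of that sort is easy to caricature: a cascade of yes/no checks that routes a transcribed command to a handler. The intents and keyword rules below are invented for illustration, not Siri's actual logic.

```python
def handle_command(text):
    """Toy decision tree in the spirit of early Siri: each branch is a
    yes/no question about the transcribed command, ending in an action."""
    words = text.lower().split()
    if "timer" in words:        # does the command mention a timer? yes -> act
        return "start_timer"
    if "weather" in words:      # no -> next question, and so on down the tree
        return "show_weather"
    if "call" in words:
        return "place_call"
    return "web_search"         # no branch matched: fall back to search
```

The limitation is obvious from the sketch: anything the tree's authors didn't anticipate falls through to the fallback, which is why this design struggles against open-ended models like ChatGPT.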

Siri is an early example of Apple's ML push

Siri is much more advanced today and can handle a far broader range of requests. Different command sequences can trigger various events, but there is room for improvement, given how quickly competitors like ChatGPT and Bard have grown.

Apple began using "Siri Intelligence" as a catch-all term for the many machine learning algorithms running on its devices. It is what surfaces calendar events from an email or suggests which apps to use based on the time of day.

This may seem rudimentary by today's standards, but it was quite the feature when it first rolled out. Now, Apple treats Siri as part of the operating system instead of just a virtual assistant.

It isn't clear what Apple has planned for Siri. An internal team was rumored to be working on a more intelligent voice assistant for Apple Vision Pro, but the project was allegedly shelved in favor of Siri.

Apple has AI projects around the world, not just at its headquarters

Another rumor states that Apple is testing ChatGPT-like technology, referred to internally as "AppleGPT." While such a tool may never reach the public, it may be the first step toward a real AppleGPT or a smarter Siri.

Rumors of a secret Swiss "Zurich Vision Lab" suggest Apple has a dedicated facility for developing AI, ML, computer vision, and more advanced Siri models. It is likely what developed Apple Vision Pro and the technology surrounding it.

In February 2023, Apple held its first annual AI summit to discuss advances in AI and ML. While the first gathering appears to have been largely a team-building exercise, it is a clear indication of Apple's interest in the field.

A new Siri enhanced by AI is expected to be revealed as soon as WWDC 2024. Rumors about several AI features in iOS 18 have been ramping up in recent months.

Apple Ask

A rumor suggests Apple is testing a tool internally called "Ask." This tool is being used by select members of the Apple Support team to generate specific answers to user queries.

Siri, Spotlight, Safari, and Messages could get AI upgrades

Not much is known about the tool, other than that it is some form of generative system drawing on Apple's support knowledge base. It may be more sophisticated than a standard LLM, since the database it draws from is continuously updated and in flux.

Ask could remain an internal tool, but it seems Apple may be using the internal testing to train a public version. It would appear as an advanced search engine to the user, perhaps within Spotlight, Siri, or both.

One application of AI for Messages could be custom emoji. Users could describe an emoji and have a representation pop out for use as a sticker or reaction in a conversation.

Apple Silicon and the Neural Engine

Another clear indicator of work on Apple AI is Apple's Neural Engine. It first appeared on the iPhone's A11 Bionic processor and was used to process things like Face ID and camera actions.

The Neural Engine is present on all modern Apple Silicon

The first Neural Engine was capable of performing 600 billion operations per second. That has since expanded to 31.6 trillion operations per second in the M2 Ultra.

The Neural Engine processes large amounts of data in real time, enabling advanced photography algorithms and features like Portrait Mode.

The advancements of Apple's Neural Engine and the introduction of the M1 processor proved that Intel just wasn't going to cut it anymore. Apple AI processes can operate significantly faster on Apple Silicon thanks to the Neural Engine.

John Giannandrea

One of the few places you can find a mention of AI on Apple's website is in John Giannandrea's job description. He's Apple's Senior Vice President of Machine Learning and AI Strategy.

SVP of ML and AI Strategy

He reports directly to Apple CEO Tim Cook and has worked at the company since 2018. His bio says he oversees strategy for AI and ML across the company and the development of Core ML and Siri.

"John hit the ground running at Apple and we are thrilled to have him as part of our executive team," Cook said after hiring Giannandrea. "Machine learning and AI are important to Apple's future as they are fundamentally changing the way people interact with technology, and already helping our customers live better lives. We're fortunate to have John, a leader in the AI industry, driving our efforts in this critical area."

Apple didn't hire Giannandrea by mistake — his eight years at Google were spent developing advanced AI systems. And one of his foundational principles for developing Apple AI is privacy.

"I understand this perception of bigger models in data centers somehow more accurate," he said in an interview, "but it's actually wrong. It's better to run the model close to the data rather than moving the data around."

That has led to Apple's push for more private on-device actions.

Privacy as a service

OpenAI's ChatGPT, Google's Bard, and Meta's Llama all have one thing in common: massive data stores built by collecting user data. This strategy has given these products an incredible understanding of people and how they think, right out of the gate.

Apple heavily emphasizes privacy in its advertising

Apple's approach to developing with privacy in mind is seen as a hindrance for AI rather than a boon. The consensus seems to be that more data means better algorithms.

However, Apple is ahead of the curve in silicon development and on-device training. Its restrictive data sets seem to have forced the company to think differently about how to train Apple AI, which could lead to a better model.

Apple's competitors suck in every piece of information on the internet to feed their models, which exposes them to a phenomenon called model collapse. As AI-generated content floods the web, these large language models increasingly train on their own output and gradually lose their "intelligence."

The so-called Apple AI likely won't suffer from model collapse because it won't indiscriminately ingest all the data on earth, including its own outputs. That means a healthier algorithm with less interference from non-human data.
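
The effect is easy to caricature in a few lines of Python. Here a "model" is just a token-frequency table: each generation it regenerates its training data from itself, and anything it produces too rarely falls below a threshold and vanishes for good. The vocabulary and cutoff are invented purely for illustration; real model collapse is statistical, but the one-way loss of rare knowledge is the same.

```python
def retrain_on_own_output(vocab, generations=3, cutoff=0.05):
    """Toy model collapse: each generation keeps only tokens the model
    emits often enough, then renormalizes. Rare knowledge drops out of
    the distribution and can never be recovered from the model's output."""
    history = [dict(vocab)]
    for _ in range(generations):
        total = sum(vocab.values())
        # Tokens below the probability cutoff are never regenerated again
        vocab = {tok: n for tok, n in vocab.items() if n / total >= cutoff}
        history.append(dict(vocab))
    return history
```

Running this on a vocabulary with a long tail of rare words shows the tail disappearing after the first generation, while only the most common tokens survive, which is the "less diverse, less intelligent" outcome researchers warn about.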

Apple will likely have complete control over whatever feeds its models.

The future of Apple AI

Stories have cropped up in recent years about how Apple has fallen behind in developing AI. Some even suggest Apple hasn't started at all, given its lack of a public chatbot.

There are mountains of algorithms in use for Apple Vision Pro

However, all evidence points to the contrary. Apple has been at the forefront of machine learning and other computer intelligence models for the past decade or more.

What's next for Apple AI is anyone's guess. It doesn't seem likely that Siri will become a chatbot that hallucinates when asked pointed questions, but that doesn't mean Apple isn't going to use LLMs, GPTs, or other tools.

Apple didn't utter the phrase "AI" once during WWDC 2023, unlike Google, which used the term over 100 times during I/O. But Apple AI was all over that developer conference, especially in visionOS.

Even iOS 17 has a key piece of technology being used to develop these popular chatbots — a transformer language model. It's being used for autocorrect, which has proven spectacular in the betas.

Apple Car

Apple's worst-kept secret is Project Titan, an autonomous vehicle project. While we've taken to calling this secret project Apple Car, it isn't clear if Apple will actually ever release a vehicle.

Internally, the project has been put on indefinite hiatus. However, everything Apple learned about autonomous driving has applications in artificial intelligence.

Apple AI will be needed to run Apple Car

Everything Apple originally developed around machine intelligence seemed to be leading to an Apple Car. A lot of technology introduced for Apple Vision Pro would work great on a vehicle after being scaled up.

Getting these systems to coordinate in real-time, detect objects, understand user commands, and generate feedback all at once is a job for an advanced Neural Engine. A report in 2021 suggested that Apple had completed such a chip and would begin testing it.

While Apple's hope to develop a self-driving vehicle may be in the rearview mirror for now, all of the work to create advanced computational systems wasn't for nothing. Each bit of progress on one product aids the progress of another.

It seems all roads lead to Apple AI.