Apple is again rumored to be planning third-party AI access through Siri in iOS 27. It would be a further expansion of customization and control, layered on top of the improved Apple Foundation Models expected in 2026.

Apple Intelligence already uses ChatGPT for certain requests, allowing it to handle complex tasks without relying solely on Apple models. It is a distinct system that requires user permission and exists entirely outside of the on-device and Private Cloud Compute options driven by Apple Foundation Models.

A recurring rumor, shared via Bloomberg, suggests that Apple will continue to rely on third-party AI systems, but this time through installed apps. This would be yet another separate system, built on APIs that let iOS 27 route requests to third-party AI tools inside apps the user has installed.

The Google Gemini partnership is another matter entirely: Apple is reportedly using distilled Gemini models to help retrain Apple Foundation Models. All of this shows that Apple isn't against working with third parties when it comes to AI tools.

AI tools via app APIs

A June 30, 2025 rumor said Apple was in talks with Anthropic and OpenAI about powering a new version of Siri. More recent reporting shows Google's Gemini is instead being used to rebuild Apple Foundation Models, but that wasn't the end of third-party involvement in Apple Intelligence.

If the latest rumor is true, Anthropic's Claude could be among the third-party AI tools accessed via the new app APIs. Basically, a user could install the Claude app and designate it for certain tasks, like ones related to coding.

It isn't clear if Apple will continue to offer ChatGPT directly via Siri if this new system is put in place. It may be seen as redundant, since, in theory, the new Gemini-trained Apple Foundation Models will be capable enough without that escape hatch.

OpenAI would then be able to reintroduce ChatGPT to Siri via the app APIs, just like everyone else. The arrangement not only levels the playing field for third-party AI tools, but also gives users more choice for certain functions.

Why opening Siri is the practical move

Support for coding agents from Anthropic and OpenAI in Xcode reinforces the idea that Apple isn't trying to build every layer itself. Instead, it wants to decide how those tools are presented and controlled within its ecosystem, and that can be done with APIs.

Apple has consistently excelled at integrating hardware, software, and services into a cohesive experience. The bigger challenge is keeping Siri competitive as AI models improve and specialize in different types of tasks.

Opening Siri to third-party AI tools allows Apple to expand capabilities without forcing a single system to handle every request. Users can choose or assign certain tasks to outside AI services through installed apps, while Apple maintains control over how those interactions are managed.

However, the default system presented to users would still be the one powered entirely by Apple Foundation Models. The on-device models and Private Cloud Compute systems may be enough for most, but the option of tying into third-party AI will be a boon for users who want it.

The reported iOS 27 plan could always change before release, since Apple often adjusts features, partners, and timing as development continues. As a case in point, this feature was rumored in 2025 to arrive in what became iOS 26.

Even so, the direction remains consistent. Each new report points the same way: Apple moving to give users more control over how they use AI and Siri.