Insider sources have confirmed what everyone already knew — Apple Foundation Models trained with the help of Google's Gemini will have the LLM features seen in other AI tools.

As artificial intelligence spreads into more available chatbots and software features, consumers have become familiar with what basic features these tools should provide. Even though Apple has already shared what its personalized Siri and Apple Intelligence features would do, one report felt the need to source internal leakers to confirm the obvious.

According to a story from The Information you'd have to pay hundreds to read, Apple's newly Gemini-trained models will have functions like the ability to answer factual questions, which has been rumored since September, and proactive handling of data found in Calendar, which was part of Apple's original Apple Intelligence announcement. The only new tidbit that might hold any interest to our readers is the potential timing of some features, but even that seems confused.

Reliable sources have pointed to a spring 2026 timeframe for the personalized Siri and Apple Intelligence launch, which was delayed in 2025. There is no doubt that Apple has more planned for WWDC 2026, and the report says insiders believe they know which features will be revealed in June.

Siri's ability to remember past conversations, similar to a chatbot, won't come until iOS 27 and will be revealed during WWDC. There's also a mention of Apple Intelligence warning users to leave sooner to avoid traffic for an airport pickup event found in the Calendar app.

While marketing head Greg Joswiak says Apple doesn't want Siri to be a chatbot, adding the ability to recall past conversations to the smart assistant would be a good move, so this insider info isn't surprising. The more proactive notifications around Calendar should sound familiar too, because they are a natural evolution of existing features in Calendar.

If a user manually sets up a "time to leave" notification on a Calendar event, the app uses Apple Maps data and the user's optimal route to determine whether traffic is heavy or light, then tells the user the best time to leave. It seems the new feature would go one step further and eliminate the need to manually set the time-to-leave notification.

Other expected features include a more conversational Siri that can react better to queries, answer with more knowledge instead of simply providing a link, and emotional support where required. For example, if a user shares that they are lonely or need help, Siri can react on a more emotionally apt level.

Siri will still be able to control the user's home, make calls, set reminders, and handle the other classic features. However, some of these experiences will be enhanced by Siri's ability to parse queries better. For example, if a user has an incomplete Contacts database, Siri can infer who a person's parent or sibling might be.

The report also provided additional color around Apple's and Google's relationship in this deal. However, if you've been reading AppleInsider, you already know that Gemini isn't taking over your Apple devices, nor is Apple sharing any information with Google.

Gemini-based AI doesn't mean Gemini

Previous reports that detailed Apple's deal with Google suggested that Apple would use a "white label" Gemini model capable of running in Apple's Private Cloud Compute servers to train Apple Foundation Models. The result would be a much better LLM without the need to scrape loads of data or spend billions on additional GPUs and servers.


Siri isn't using Gemini on your devices or in Apple's servers, it's using Apple Foundation Models

This report reaffirms what we already knew — it will be Apple Foundation Models running directly on Apple devices or via Private Cloud Compute. Google won't be involved in any part of a user's interaction, as Gemini is only being used to develop Apple's models, and isn't replacing Apple's models.

Since Apple's deal with Google isn't a one-time lease, but a potential years-long partnership, Apple allegedly has the ability to work with Google on tweaking the available Gemini model. If a specific change is needed, Apple can ask Google to make the change so that Apple can better implement a feature in its Apple Foundation Models.

One person familiar with internal testing says that the current prototype doesn't have any Google or Gemini branding. Of course, this makes sense, as the entire stack that is accessed by the end user is Apple's technology, unlike using Google Search via Siri, for example.

Apple, Google, OpenAI, and Apple Intelligence

The final portion of the report suggests that ChatGPT is the biggest loser in this situation. It also says that users haven't accessed ChatGPT much through the integration, and that the partnership hasn't driven much traffic to OpenAI.


Apple incorporating third-party AI systems connected to Apple Intelligence will help it win the AI race

There is zero doubt that Gemini's training of Apple's models will result in a better experience that doesn't fall back to ChatGPT as often, if ever. That said, Apple's Gemini deal doesn't completely replace ChatGPT's tie-ins to Apple Intelligence, and the two seem designed to operate in tandem.

What the report fails to mention is that users will still be able to ask Siri or Image Playground to have ChatGPT generate an image. ChatGPT image alteration and generation is also a feature in the new Pixelmator Pro coming in the Apple Creator Studio.

Some aspects may change, like how Visual Intelligence uses ChatGPT for describing an image. That could easily be powered by Apple Foundation Models going forward, or users could be given the option of which model to use.

What is coming in the spring is the previously promised more personal Siri and Apple Intelligence powered by app intents. This system, run by Apple Foundation Models trained by Gemini and able to call to ChatGPT as needed, could be the ultimate AI collaboration for users.

It is also private, secure, and environmentally friendly. Apple users are set to benefit from these collaborations, and Google effectively gets back $1 billion of the money it pays Apple for default search placement, so it's a win-win.

Overall, the report citing Apple employees familiar with future plans doesn't give us much, if anything, truly new or revelatory. I suppose I hoped for more from a publication that charges hundreds for reports like these, even if one of them did suggest that Tony Fadell is a contender for Apple CEO.

It's set to be an exciting few months for Apple fans. Stay tuned to AppleInsider as we sift through all the details and clarify everything that's coming to Apple devices and services.

Artificial intelligence may be a poorly named technology, but it is useful when implemented correctly. While we wait for the AI bubble to pop and things to normalize a bit, it will be interesting to see exactly where Apple lands when everything settles.