
Apple's 'Ask' project may be far more than just an AI-assisted support tool

Apple is working on AI tools



A report that revealed Apple's "Ask" tool may not have been the whole story, with a leaker exclusively telling AppleInsider that the project goes beyond a simple large language model or generative AI tool.

Sparse details about a new tool called "Ask" being tested by Apple employees surfaced on February 23. Since then, we've obtained more information about the project.

Late on Sunday, a leaker reached out to AppleInsider with more information on the Apple "Ask" tool. The leaker claims that "Ask" is "not a LLM or other generative AI like some think."

The leaker leans into this, saying that because the support knowledge database, and the front end that support staff use to query it, are constantly evolving, "Ask" needs to be far more than that. They go on to say it is intended to be an advanced natural-language search engine to assist support staff.

Despite spending all of Monday and a good part of the night into Tuesday trying to breach Apple's wall of secrecy around the project, we cannot absolutely confirm the provenance of the information, and we were not given a way to ask follow-up questions.

Efforts to get more information continue, because the technology at its core seems like an obvious addition to a future series of OS releases. Should we get more information on Tuesday or Wednesday, we will update this article accordingly.

What is Apple Ask?

Apple has launched a pilot program that gives select AppleCare support advisors an AI tool called "Ask," which automatically generates responses to technical questions based on information from Apple's internal database.

Unlike a simple search tool, which returns the same results every time based on relevance, the "Ask" program generates an answer based on specifics mentioned in the query, like device type or operating system. Advisors can mark these answers as "helpful" or "unhelpful."
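In concept, that workflow resembles retrieval-augmented question answering: pull the internal articles most relevant to the specific query, generate an answer conditioned on them, and record the advisor's feedback. The sketch below is purely illustrative; every name in it, from the Article and KnowledgeBase types to the generate_answer function, is an assumption, not anything Apple has confirmed.

```python
# Hypothetical sketch of a query-aware support answer flow; not Apple's code.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Article:
    doc_id: str
    device: str       # e.g. "iPhone 15"
    os_version: str   # e.g. "iOS 17.3"
    text: str

@dataclass
class Answer:
    text: str
    sources: list[str]              # doc_ids the answer was built from
    feedback: Optional[str] = None  # "helpful" or "unhelpful", set by the advisor

class KnowledgeBase:
    def __init__(self, articles: list[Article]):
        self.articles = articles

    def retrieve(self, query: str, device: str, os_version: str, k: int = 3) -> list[Article]:
        # Naive relevance score: keyword overlap, boosted when device and OS match the query context.
        def score(article: Article) -> int:
            overlap = len(set(query.lower().split()) & set(article.text.lower().split()))
            return overlap + 2 * (article.device == device) + 2 * (article.os_version == os_version)
        return sorted(self.articles, key=score, reverse=True)[:k]

def generate_answer(query: str, context: list[Article]) -> Answer:
    # Stand-in for the generative step: here we simply stitch the retrieved text together.
    body = " ".join(article.text for article in context)
    return Answer(text=f"Based on internal documentation: {body}",
                  sources=[article.doc_id for article in context])

# Example: an advisor asks about battery drain and marks the response.
kb = KnowledgeBase([
    Article("KB-001", "iPhone 15", "iOS 17.3", "Check Battery Health and background app refresh settings."),
    Article("KB-002", "MacBook Air", "macOS 14", "SMC reset does not apply on Apple silicon."),
])
answer = generate_answer("Battery drains overnight",
                         kb.retrieve("Battery drains overnight", "iPhone 15", "iOS 17.3"))
answer.feedback = "helpful"
```

The point of the sketch is only that an answer built this way changes with the device and OS mentioned in the query, unlike a static search result.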

Chatbots tend to make things up with high confidence, a problem made worse lately as chatbots start feeding on the output of other chatbots. This behavior is called "hallucination," and it is obviously bad for Apple employees providing help to consumers.

The "Ask" tool attempts to avoid this behavior by being trained only on its internal database with additional checks that ensure responses are "factual, traceable, and useful."

There's a good chance the "Ask" tool either is, or is based on, the previously leaked "Ajax," an internal tool that some have allegedly referred to as "AppleGPT."

Tim Cook has said directly that Apple is working on AI tools for likely release at some point in 2024. Even though nothing has been announced, the company is likely building and testing many tools that rely on generative models similar to those behind ChatGPT.

What Apple's push into AI might mean for iOS 18 has yet to become clear. WWDC in June will likely bring details.



15 Comments

Massiveattack87 1 Year · 102 comments

What Apple is advertising with their generative AI is nothing more than what Google Pixel phones or the S24 already have: dedicated machine learning hardware and ML models that run on the phone to do things like real-time transcription, image recognition, etc.
These are not LLMs, nor comparable to what ChatGPT has already brought out.

As LLM usage is quite limited by the capacity and storage available on the phone, this function is limited to AppleCare (all the necessary data for AppleCare can be handled within the chip's capacity).

In the end, something interesting and exciting (if!) only for AppleCare customers.

1 Like · 0 Dislikes
eriamjh 18 Years · 1778 comments

An intelligent help system for the obscure errors or the just plain illogically buried features in iOS and macOS would be great, but that isn't going to help Apple sell phones or Macs.

They need a useful, productive AI tool. Smarter Siri. Smarter searching. Smarter interaction.

Smarter voice recognition. I'd rather have a tool ask me to clarify something than guess at what I said and get it wrong.

Set a timer for 15 minutes. 
“Did you say 15 minutes or 50 minutes?”
The first.
"OK"

Or notice that I usually ask for 15, and decide based on past history, typical times, and when I usually ask.

If Apple rolls out an awesome AI tool and tells everyone, maybe it will help their stock price.  

4 Likes · 0 Dislikes
macbootx 11 Years · 72 comments

Apple has had a distinct vision for an “intelligent agent” going back to 1987. It was called “Knowledge Navigator”. The technology has advanced to the point where this concept may soon become reality. 

3 Likes · 0 Dislikes
avon b7 21 Years · 8062 comments

There is an incredible amount of help-related information just begging to be wrapped up in an 'intelligent', user-friendly way.

Even the most mundane things, like asking 'how much will it cost to change the battery on this phone?' and then getting instant, up-to-date pricing, in-store turnaround times, shipping options, etc. for your nearest locations.

1 Like · 0 Dislikes
michelb76 9 Years · 706 comments

What Apple is advertising with their generative AI is nothing more than what Google Pixel phones or the S24 already have: dedicated machine learning hardware and ML models that run on the phone to do things like real-time transcription, image recognition, etc.
These are not LLMs, nor comparable to what ChatGPT has already brought out.

As LLM usage is quite limited by the capacity and storage available on the phone, this function is limited to AppleCare (all the necessary data for AppleCare can be handled within the chip's capacity).

In the end, something interesting and exciting (if!) only for AppleCare customers.

Which is a perfectly limited use case for getting a model like this off the ground on iPhone. Easy to test, and I bet most of the data was already tagged in a way that's beneficial for learning. The other models that have been running on iPhone for a few years now are a bit harder to get right, like Moments, image tagging, and object/person/pet recognition in Photos.