A new report highlights Apple's ongoing struggles to modernize its smart assistant, Siri, at a time when artificial intelligence chatbots are poised to reign supreme.
It's been suggested that Apple has decided to step up its game and focus on natural-language search processing, but the shift may be causing trouble behind the scenes.
A new report from The Information on Thursday morning takes a deep dive into Apple's forays into artificial intelligence — specifically regarding Siri.
It suggests that a series of unfortunate events and departures has led to a lack of confidence among employees.
In late 2022, three of Apple's engineers responsible for AI implementation departed to work for Google. They believed that Google was a better environment for working on large-language models, or LLMs, which allow smart chatbots to provide users with humanlike responses to questions. The departure was a blow to Apple's artificial intelligence teams, which were already hampered by internal issues.
Over three dozen former Apple employees cited organizational dysfunction and a lack of drive as hindrances to the company's AI and machine-learning efforts. This was especially true when working on Siri, Apple's highest-profile AI technology.
While Apple's relatively small gains in artificial intelligence haven't hurt the company, the tech giant likely knows it can't entirely ignore the shift to LLMs.
However, employing nascent technology carries risk. ChatGPT is notorious for producing answers riddled with inaccuracies, if not outright falsehoods. Apple, more so than its competitors, would likely want to protect its brand image.
Thursday's report also points out that Apple may be reluctant to implement LLMs because they would require queries to be processed in the cloud. Apple has spent years moving many of Siri's functions on-device, which helps protect users' privacy when they use the voice assistant.
Apple's senior vice president of Machine Learning and AI Strategy, John Giannandrea, joined Apple in 2018. At that time, the company had few protocols in place for collecting data on how users interacted with Siri, partly because of privacy concerns and partly because the company had decided the metrics weren't worth the investment.
Giannandrea attempted to expand that research, but executives ultimately paused the project when the media reported that third-party contractors had listened to Siri recordings without users' consent.
Instead, the company focused on curating responses that helped maintain its brand image. According to former Apple employees, most of Siri's "canned" responses are written or reviewed by humans. This helps reduce potentially embarrassing, if not outright worrisome, responses.
What direction Apple takes with Siri remains to be seen. Some reports claim that the company is already researching ways to safely implement improved intelligent capabilities for user queries. Still, much like its foray into virtual reality, Apple likely won't put out any intelligent search features until it's ready.
Comments
There's nothing surprising here, because Apple's course is the more difficult one. As noted in the article, there are two giant hurdles for Apple's implementation of Siri that aren't so big for others. The first is privacy, and the second is quality. Moving more Siri functions out of the cloud and on-device means processing has to be much tighter. Limiting the personal data that Apple collects means there are fewer contextual points that Siri can use when responding to a query. On the quality side, Siri is far less likely than competitors to offer an incorrect answer, but far more likely to offer a web search or to simply confess to not being able to produce the answer.
Interestingly, this is analogous to your choices when hiring a human personal assistant. Do you want an assistant who is respectful and discreet, or one who knows more about you because they regularly rifle through all your stuff and share info about you with friends and paying customers? That second option will already know you might want to order some flowers, but that's because they listened to every bit of that argument you had with your wife this morning. Do you want an assistant who asks for clarification and is willing to admit when they simply don't know the answer, but can help you find it, or one who always tells you what you want to hear, whether or not their info is correct? The first one can be a little frustrating, but that second one is willing to send you out there misinformed.
I know I'd rather have the assistant who is discreet and self-confident enough to be honest about what info they can provide.
Maybe I see it from a very simple and uninformed point of view, but couldn't Apple give users a choice? Choose on-device Siri for more privacy but fewer features, or cloud-based Siri that can handle more things?
So scared of making a mistake that management let the lead Apple had wither. Embarrassing.
Apple will need to figure it out quick. They have spent the past 7-10 years working on cars and VR, which have yet to be seen, let alone become anything more than niche products. Meanwhile, in a matter of months, ChatGPT has revolutionized the way we do things, and Apple was caught completely flat-footed. I use ChatGPT daily and cringe when I need to ask Siri to do something on my phone. The fact that Google and Microsoft are adding AI throughout their software should be a major concern for Apple.