
Apple CEO Tim Cook calls AI a fundamental technology

AI is already a big part of Apple

Artificial intelligence is the industry buzzword these days, and while Apple is remaining tight-lipped on the subject, CEO Tim Cook promises it's an integral part of its products.

Apple held its quarterly earnings call on Thursday evening, giving journalists a chance to ask CEO Tim Cook and CFO Luca Maestri about the results and the company. While many questions draw negligible replies, some answers provide insight into where things may be going.

A few questions steered the call toward AI and Apple's investments in it. Apple's leadership was clear on the company's stance: AI isn't just being developed, it's already part of everything.

"If you kind of zoom out and look at what we've done on AI and machine learning and how we've used it, we view AI and machine learning as fundamental technologies and they're integral to virtually every product that we ship," Cook replied to a question about generative AI. "And so just recently, when we shipped iOS 17, it had features like personal voice and live voicemail. AI is at the heart of these features."

Later in the call, Cook was asked about a roughly $8 billion increase in R&D spend and where that money was going. His response was simple: among other things, it's going to AI, silicon, and products like Apple Vision Pro.

These comments are in line with what Cook and the company have said about AI in the past. Plenty of features in iOS already rely on machine learning and even AI, like the new transformer language model for autocorrect.

Some assume that Apple's AI efforts will lead to a more powerful Siri, though it isn't clear if or when such a project will surface.



11 Comments

OctoMonkey 5 Years · 343 comments

and yet...  When my iPhone does a voice-text transcription of voicemail message, it spells my name wrong.  A whole lot of intelligence there!

rmusikantow 16 Years · 107 comments

OctoMonkey said:
and yet...  When my iPhone does a voice-text transcription of voicemail message, it spells my name wrong.  A whole lot of intelligence there!

Hard to Spell OctoMonkey.

9 Likes · 0 Dislikes
mikethemartian 19 Years · 1504 comments

Likewise, I’m going to go out on a limb and declare that, in my view, Pi and the speed of light c are fundamental constants of our universe.

2 Likes · 0 Dislikes
danox 12 Years · 3470 comments

OctoMonkey said:
and yet...  When my iPhone does a voice-text transcription of voicemail message, it spells my name wrong.  A whole lot of intelligence there!

Apple, unlike its competition, is working toward on-device AI solutions, not "ET phone home" AI. What’s implemented on the Apple Vision Pro begins with the M2/M3 chip working in conjunction with the R1 chip, which knows what to do when you look at something or slightly rub your fingers together to execute a command. In short, it won’t be Google’s (Video Boost) implementation, where information is sent to Google HQ and then bounced back to you after being scrubbed of personal information. I think the AI path Apple is taking is more private for the end user (on device).

On-device AI is more complicated because it actually requires an SoC combined with OS-level software, and consequently takes a hell of a lot more time to design, engineer, and develop. Designing something that works on a supercomputer back home at Google/Microsoft HQ does not, which is why Google resorted to that method to cover for the shortcomings of the Tensor SoC in the Pixel 8 Pro, which is five years behind Apple.

6 Likes · 0 Dislikes
OctoMonkey 5 Years · 343 comments

danox said:
and yet...  When my iPhone does a voice-text transcription of voicemail message, it spells my name wrong.  A whole lot of intelligence there!

Apple, unlike its competition, is working toward on-device AI solutions, not "ET phone home" AI. What’s implemented on the Apple Vision Pro begins with the M2/M3 chip working in conjunction with the R1 chip, which knows what to do when you look at something or slightly rub your fingers together to execute a command. In short, it won’t be Google’s (Video Boost) implementation, where information is sent to Google HQ and then bounced back to you after being scrubbed of personal information. I think the AI path Apple is taking is more private for the end user (on device).

On-device AI is more complicated because it actually requires an SoC combined with OS-level software, and consequently takes a hell of a lot more time to design, engineer, and develop. Designing something that works on a supercomputer back home at Google/Microsoft HQ does not, which is why Google resorted to that method to cover for the shortcomings of the Tensor SoC in the Pixel 8 Pro, which is five years behind Apple.

Let us say your phone is set up with the owner contact information being Brad Lee Smith.  If somebody leaves a voice-mail message starting with "Hello Brad Lee", the transcription should not read "Hello Bradley".  This is not a particularly complicated issue to understand.  The old Knowledge Navigator concept from 35 years ago showed this type of "AI" in action.

3 Likes · 0 Dislikes