
When you report bugs on iOS, some content may be used for AI training


If you decide to report a bug on a beta version of iOS, you now apparently have to let Apple use the uploaded content for Apple Intelligence training with no way to opt out.

On Monday, Apple announced its plans for a new opt-in Apple Intelligence training program. In essence, users can let Apple use content from their iPhone to train AI models. The training itself happens entirely on-device, and it incorporates a privacy-preserving method known as Differential Privacy.

Apple has taken measures to ensure that no private user data is transmitted for Image Playground and Genmoji training. Differential Privacy introduces artificial noise into each report, so individual data points cannot be traced back to their source.
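For readers curious about the mechanism, the classic randomized response technique illustrates how local differential privacy of this kind works. The Python sketch below is purely illustrative and is not Apple's implementation; the epsilon value, the yes/no signal, and the 30% ground-truth rate are assumptions chosen for the example. Each device flips its answer with some probability before reporting, so no single report reveals the truth, yet the server can still recover an accurate population-level estimate.

```python
import math
import random

def randomized_response(true_bit: bool, epsilon: float = 1.0) -> bool:
    """Report a yes/no signal with local differential privacy.

    With probability p = e^eps / (e^eps + 1) the true value is reported;
    otherwise the flipped value is. Any single report is deniable, but
    the aggregate over many devices can still be estimated accurately.
    """
    p = math.exp(epsilon) / (math.exp(epsilon) + 1)
    return true_bit if random.random() < p else not true_bit

# Hypothetical example: estimate what share of 10,000 devices observed a
# given signal, without learning any individual device's true answer.
epsilon = 1.0
truth = [random.random() < 0.3 for _ in range(10_000)]      # ground truth: ~30% "yes"
reports = [randomized_response(t, epsilon) for t in truth]  # what the server sees

# Debias the aggregate: observed = p * rate + (1 - p) * (1 - rate)
p = math.exp(epsilon) / (math.exp(epsilon) + 1)
observed = sum(reports) / len(reports)
estimated_rate = (observed - (1 - p)) / (2 * p - 1)
print(f"Estimated share of 'yes' devices: {estimated_rate:.1%}")
```

The noise makes any individual report deniable, which is the property Apple cites when it says uploaded data cannot be traced to a particular user.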

Even so, some users are unhappy about the opt-in AI training program. While Apple said it would arrive in a future iOS 18.5 beta, one developer has already noticed a possibly related change in the Feedback app.

In a social media post, developer Joachim outlined a new section of Apple's privacy notice in the Feedback application. When uploading an attachment as part of a bug report, such as a sysdiagnose file, users now need to give Apple consent to use the uploaded content for AI training.

The Feedback app's submit-report notice, with options to go back, not be asked again, or submit. The privacy notice now references AI training.

"Apple may use your submission to improve Apple products and services, such as training Apple Intelligence models and other machine learning models," the notice reads, in part.

The developer who spotted this addition criticized Apple for not including an opt-out option, saying that the only way users could opt out was by not filing a bug report at all.

They blasted Apple, expressing frustration that the iPhone maker decided to "hide it in the other privacy messaging stuff," and made it very clear that this was something they did not want.

Still, this is only one developer's reaction, though a few others have chimed in with replies. It remains to be seen whether other developers will echo the sentiment, but that seems likely given the absence of an opt-out option.

Anyone who wants to file a bug report will have to consent to Apple's AI training, and people will understandably be upset, even with privacy-preserving measures in place.

You can opt out of AI training, but not when reporting bugs

As time goes on, Apple's AI training program will expand to other areas of the iPhone operating system beyond bug reporting. The company wants to use Differential Privacy-based AI training for Genmoji, Image Playground, and Writing Tools.

The Analytics & Improvements screen in Settings, with toggles for sharing iPhone, Watch, and iCloud analytics. It's possible to opt out of Apple's AI training program, but not if you want to report bugs on a beta version of iOS.

While nothing has yet been implemented on that front, users are able to opt out of the on-device Apple Intelligence training program by turning off analytics in Settings.

To do so, open Settings, scroll down to Privacy & Security, then select Analytics & Improvements and turn off the "Share iPhone & Watch Analytics" toggle.

There appears to be no way for developers to opt out of training Apple's AI with their bug reports at this time.

11 Comments

DAalseth 7 Years · 3290 comments

The developer who spotted this addition criticized Apple for not including an opt-out option, saying that the only way users could opt out was by not filing a bug report at all.
If this is true it would adversely impact bug reporting. A lot of us are working very hard to not let our data be used for AI training. Now we may be in possession of an important bug report but have to violate our principles if we want to report it. Many may decide not to.  If this is true, it would be a very stupid move on Apple’s part.

1 Like · 4 Dislikes
swat671 10 Years · 169 comments

I fail to see an issue here. What IS the issue, exactly?

2 Likes · 0 Dislikes
swat671 10 Years · 169 comments

DAalseth said:
The developer who spotted this addition criticized Apple for not including an opt-out option, saying that the only way users could opt out was by not filing a bug report at all.
If this is true it would adversely impact bug reporting. A lot of us are working very hard to not let our data be used for AI training. Now we may be in possession of an important bug report but have to violate our principles if we want to report it. Many may decide not to.  If this is true, it would be a very stupid move on Apple’s part.

Again, what’s the issue? What “principles”? You kinda sound like those vegans who just like to hear themselves talk and make themselves sound self-important. 

3 Likes · 1 Dislike
retcable 5 Years · 3 comments

As others, I fail to see a problem here. Apple's constant drumbeat about privacy and security of customers' information is the reason Siri and Apple Intelligence are pretty much useless today. Most Apple users do not allow the collection and use of any personal information, and this is based on tech pundits and Apple recommending over the years that those users turn off any data-collection settings in all their devices. How is Siri supposed to learn anything about you, where you are, what you're trying to do, what you like to do, where you go, where you work, where you like to eat, shop, and play, etc. etc. if you do not allow it to collect any information in regards to any of those things? This information can be collected anonymously, but few Apple users allow it.
Google, on the other hand, knows everything about you, where you live, where you go, where you eat, shop, play, and work. They collect all this data every time you use a Google-related app, whether you have allowed them to do so or even know about it or not. And the result is that Google's digital assistant is FAR more useful and correct in everything it does, because it KNOWS you. There is just no comparison in the utility of Google's assistant and Siri's ignorance. And it is because Siri knows NOTHING about you.

3 Likes · 1 Dislike