Apple has strict guidelines about protecting user data with sandboxing, but ChatGPT for Mac sidestepped those protections by storing conversations in plain text until it was patched on June 28.
When everything is working the way it should on a Mac, data is siloed between apps: no app can access another app's data without going through approved APIs or getting the user's permission. ChatGPT ignored Apple's guidance and broke that structure by opting out of sandboxing and storing user conversations in plain text.
Storing files this way left them readable by any other app on the Mac. If a user's machine was infected with malware or a malicious app, the private data shared with ChatGPT could be harvested with no extra effort.
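To see why this matters, here is a minimal sketch of the problem (the file path is hypothetical, for illustration only, not ChatGPT's actual storage location): a file one program writes in plain text can be read by any other process running under the same user account, using nothing but ordinary file APIs.

```python
import pathlib

# Hypothetical unencrypted conversation store written by one app.
# (Illustrative path only -- not ChatGPT's real storage location.)
store = pathlib.Path("/tmp/demo_conversations.json")
store.write_text('{"messages": ["private chat with the assistant"]}')

# Any other process running as the same user -- including malware --
# can read the file directly; no entitlement or permission prompt applies.
leaked = pathlib.Path("/tmp/demo_conversations.json").read_text()
print(leaked)
```

Sandboxing prevents exactly this: a sandboxed app's files live in a per-app container that other apps cannot open, and encrypting the data at rest means even a successful read yields ciphertext.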
Developer Pedro José Pereira Vieito discovered the problem and shared it on Threads.
An update to ChatGPT for Mac was issued on Friday to patch the problem. Conversation data is now encrypted.
"We are aware of this issue and have shipped a new version of the application which encrypts these conversations," OpenAI spokesperson Taya Christianson says in a statement to The Verge. "We're committed to providing a helpful user experience while maintaining our high security standards as our technology evolves."
When an app is submitted to the Mac App Store, it goes through a review process that requires sandboxing, a mechanism that restricts each app to its own data and nothing else on the system unless the user grants access. (Notarization, which covers apps distributed outside the store, checks for malware but does not require sandboxing.)
OpenAI's ChatGPT for Mac app is distributed from the web and doesn't use sandboxing. The app can access private data the user shares, like emails and confidential records, to perform whatever task the user asks.
If you've installed ChatGPT for Mac, make sure it has been updated to the latest version. While the vulnerability likely wasn't exploited in the short time since the app launched, it's still a sloppy mistake for a company like OpenAI to make.
The ChatGPT for Mac app is separate from OpenAI's larger partnership with Apple. Later this fall, as part of macOS Sequoia, users will be able to opt in to sending some Apple Intelligence requests to ChatGPT.