The booming world of artificial intelligence (AI) has brought many exciting possibilities, but also significant worries about user privacy. Data breaches and copyright disputes are ongoing battles, and many users are anxious about their information falling into the wrong hands.
These concerns were recently amplified after a critical security flaw was discovered in the new ChatGPT app for macOS. The app, released on June 25th, was found to be storing user conversations in plain text, making them readable by any other app or user on the Mac (as reported by AppleInsider). The vulnerability existed until a patch was released on June 28th. The issue was first flagged by developer Pedro José Pereira Vieito, who detailed the problem on the social media platform Threads.
Apple’s security guidelines mandate that apps store data in “sandboxes”: isolated environments that keep an app’s data inaccessible to other programs without explicit user permission, protecting sensitive information like photos, calendars, and text messages. The ChatGPT app, however, was not sandboxed, and OpenAI stored conversations unencrypted in a location any program could read.
The vulnerability extended beyond third-party apps. Any malware that infiltrated a Mac could have easily scooped up all the user’s conversations with ChatGPT. This could have had severe consequences, considering the potentially sensitive nature of the information people might share with the chatbot.
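To see why this matters, here is a minimal Python sketch of the scenario. The directory and file names are simulated stand-ins, not the app’s real storage paths (the pre-patch app reportedly wrote unencrypted conversation data under the user’s home folder); the point is that any process running as the same user can read such files with ordinary file access and no permission prompt.

```python
import json
import pathlib
import tempfile

# Simulate the pre-patch situation: an app writes a conversation to disk
# as plain, unencrypted JSON. The paths here are illustrative stand-ins,
# not the ChatGPT app's actual storage locations.
store = pathlib.Path(tempfile.mkdtemp())
(store / "conversation-001.json").write_text(json.dumps({
    "messages": [{"role": "user", "content": "my bank account number is ..."}]
}))

# A second, unrelated process running as the same user needs no entitlement
# or permission dialog to scoop the data up -- plain file reads suffice.
for path in store.glob("conversation-*.json"):
    leaked = json.loads(path.read_text())
    print(leaked["messages"][0]["content"])
```

A sandboxed app, by contrast, keeps its container off-limits to other processes, and encrypting conversations at rest (as the patched version now does) protects the data even from malware running under the same user account.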
Apps submitted to Apple’s Mac App Store go through App Review, during which Apple checks them against various requirements, including proper sandboxing, to ensure app data remains inaccessible to external programs. Apps distributed outside the store are only notarized: an automated Apple scan that checks for malware but does not enforce sandboxing.
The crux of the problem lies in the distribution method. The ChatGPT Mac app isn’t offered through the App Store but downloaded from OpenAI’s website, so it was never subject to the App Store’s sandboxing requirement, creating a security loophole.
OpenAI acknowledged the issue in a statement to The Verge: “We are aware of this issue and have shipped a new version of the application which encrypts these conversations. We’re committed to providing a helpful user experience while maintaining our high security standards as our technology evolves.”
While the spotlight falls on the ChatGPT Mac app in this instance, it serves as a stark reminder: any app downloaded from outside the App Store escapes Apple’s strictest review and could pose a similar security risk. Install apps only from trusted sources, and exercise caution when sharing sensitive information with any app.
