OpenAI has confirmed that the login credentials of some ChatGPT users were compromised. This cybersecurity breach allowed threat actors to gain unauthorized access. ArsTechnica had initially reported that ChatGPT leaked private conversations, including sensitive details such as usernames and passwords.
However, OpenAI disputed ArsTechnica's claims as inaccurate and clarified that its fraud and security teams were actively investigating the incident.
According to OpenAI, the compromised ChatGPT login credentials allowed a malicious actor to gain unauthorized access to and misuse the affected accounts, leading to the leak of chat history and files.
The user whose account was reportedly compromised initially did not notice any unauthorized access to their account. OpenAI underscored that ongoing investigations will provide more insight into the scope of the breach and determine the steps needed to address the security issue.
ArsTechnica reported that ChatGPT displayed private conversations, raising concerns about the potential exposure of sensitive information. The leaked conversations included details from a support system used by employees of a pharmacy prescription drug portal, exposing troubleshooting issues, app names, store numbers, and additional login credentials. Another leaked conversation disclosed details of a presentation and an unpublished research proposal.
This incident adds to a string of earlier security issues involving ChatGPT. In March 2023, a bug reportedly leaked users' chat titles. Later, in November 2023, researchers successfully extracted private training data from the language model by manipulating queries.
Users are advised to exercise caution when using AI bots like ChatGPT, especially those created by third parties. The absence of standard security features on the ChatGPT site, such as two-factor authentication (2FA) or the ability to review recent logins, raises concerns about the platform's security measures.
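For readers unfamiliar with how 2FA works under the hood, the most common scheme is the time-based one-time password (TOTP) defined in RFC 6238: the server and the user's authenticator app share a secret, and both derive a short code from it and the current time, so a stolen password alone is not enough to log in. The sketch below is a minimal, illustrative implementation using only the Python standard library; it is not OpenAI's code and is unrelated to any specific platform.

```python
import base64
import hashlib
import hmac
import struct
import time


def totp(secret_b32, t=None, step=30, digits=6):
    """Compute an RFC 6238 TOTP code from a base32-encoded shared secret.

    A login server would compare this value against the code the user
    types in from their authenticator app.
    """
    key = base64.b32decode(secret_b32.upper())
    # Both sides count 30-second intervals since the Unix epoch.
    counter = int((time.time() if t is None else t) // step)
    msg = struct.pack(">Q", counter)
    # HMAC-SHA1 of the counter, then dynamic truncation (RFC 4226).
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)
```

With the RFC 6238 test secret (the ASCII string "12345678901234567890" in base32), `totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", t=59)` reproduces the standard's test vector, which illustrates why both parties must agree on the shared secret and the clock.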