5 Big Flaws With ChatGPT That Still Haven’t Been Solved
Expecting privacy on today’s web almost seems idealistic, and OpenAI’s data collection practices do little to counter that notion. Cybersecurity firm Surfshark recently conducted a privacy analysis which revealed that OpenAI trained its chatbot on a wealth of user data for which it lacked the appropriate legal clearance.
The allegation is that OpenAI may have gathered users’ information without their permission, in violation of the General Data Protection Regulation (GDPR). It also appears that OpenAI did not notify the people whose data was used to train the AI tool, which is another GDPR breach. Data controllers are legally required to inform users about the collection and use of their personal data so that they have the choice to opt out.
This same violation led to ChatGPT’s temporary ban in Italy, though OpenAI has since addressed the issue with a form that lets users within the European Union opt out of having their data used for training purposes. Oddly, that option is not available to users in other regions, which means that if you’re not in the EU, ChatGPT may still collect and store personal information from your queries and prompts, data uploads, and any feedback you provide. It can also share any of this information with third parties without your knowledge, as stated in its privacy policy.
There have been other privacy breaches as well. In March 2023, a bug allowed some ChatGPT users to see conversations from other active users’ chat history, as well as the payment information of some ChatGPT Plus subscribers. The following month, Gizmodo reported that Samsung employees had unwittingly leaked confidential company information via ChatGPT. OpenAI’s privacy policy has not changed since these incidents, so the problems do not seem to be going away anytime soon.