ChatGPT has become one of the defining artificial intelligence applications of the moment. Millions of people now use it for all kinds of purposes, from summarizing documents and drafting text to finding bugs in their code.
However, even users who have accepted the terms of service may not realize that their conversations are not as private as they imagine. And conversations are only part of the data collected by OpenAI, the American company behind the famous chatbot.
What ChatGPT collects from its users
When we use messaging services, email clients and word processors, to name a few examples, we often assume that the data we enter is protected. That is, it can only be seen by the people for whom it is intended.
ChatGPT is different. It is an experimental system that was released to the public while still in full development, and it is not suitable for sharing secrets or confidential information. In fact, conversations themselves are one of the key inputs used to improve it.
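Since anything typed into the chat box may be stored and reviewed, one practical precaution is to scrub obvious personal identifiers from text before pasting it into a chatbot. The sketch below is purely illustrative and not part of any OpenAI tool: a minimal, hypothetical `redact` helper that masks common patterns (email addresses, phone numbers, card-like digit runs) with placeholders. Real-world PII detection is far more involved; this only shows the idea.

```python
import re

# Hypothetical helper (not an OpenAI feature): masks common personal
# identifiers in text before it is pasted into a chatbot prompt.
# Patterns are deliberately simple and will not catch every case.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "PHONE": re.compile(r"\+?\d[\d\s().-]{7,}\d"),
    "CARD": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def redact(text: str) -> str:
    """Replace each match with a [TYPE] placeholder."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

print(redact("Contact me at jane.doe@example.com or +1 (555) 123-4567."))
```

A scrubbing step like this reduces, but does not eliminate, the risk of sensitive data ending up in a provider's logs; the safest option remains simply not pasting confidential material at all.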
“As part of our commitment to safe and responsible AI, we review conversations to improve our systems and ensure content meets our security policies and requirements,” OpenAI says in a FAQ on its website.
In the same document, the company notes that conversations may even be reviewed by its “AI trainers” to improve its systems. In other words, what is typed into the ChatGPT chat box can also be seen by company employees.
The company, run by Sam Altman, indicates that it may collect the following personal information when you use its services:
- Contact information
- Account credentials
- Payment information
- Transaction history
- User input and uploaded files
- IP address
- Browser type and settings
- Date and time of requests
- How you interact with OpenAI's websites
- Type of device
As noted, this personal information can be used to improve OpenAI's existing products and services, as well as to create new ones. However, the data may also end up in the hands of third parties. According to the company, there are a few scenarios in which this could happen.
Vendors and service providers. OpenAI says it may provide personal information to these outside parties to help meet its business needs.
Business transfers. In the event that OpenAI is sold and taken over by another company, reorganizes, files for bankruptcy, or other similar scenarios, data may be transferred.
Legal requirements. The company says data may also be disclosed when required to comply with legal obligations.
Affiliates. They also note that they may share personal information with an entity that controls, is controlled by, or is under common control with OpenAI.
As we can see, the company’s data collection is far from minimal, and it is important to keep this in mind when using its services. It has also raised alarms in some countries: Italy has already banned ChatGPT, and other European countries such as Germany and France could follow suit.