Browsing the internet and using applications should always be done with caution, and this applies equally to artificial intelligence applications, where you should be especially careful about the information you share.
One question that has arisen is what happens to the photos we upload to AI applications.
Here are five types of information you should never share with ChatGPT or any other AI chatbot to protect your privacy and security, according to tn.com.
Images with biometric data
Images of your face can expose biometric data to artificial intelligence and the company behind it. While it's fun to create Studio Ghibli-style illustrations, it's important to know that these images can be used to collect biometric information.
Personal data
Information such as your ID number, date of birth, address, or usual routes should be kept private.
Medical results
Sharing sensitive medical information, such as diagnoses or test results, is not only unnecessary but can also be dangerous. ChatGPT is not a doctor and cannot provide accurate diagnoses even if asked, so it is best to keep this information away from such apps.
Bank details
While it may seem obvious, it's worth repeating: never share credit card numbers, bank account passwords, or financial transaction details, whether with AI or anyone else online. Although ChatGPT is designed to respect privacy, it doesn't offer the same level of security as a banking website.
Confidential company information
If you use ChatGPT for work purposes, you should not share confidential company information.
Source: iefimerida.gr