Since the introduction of AI chatbots like ChatGPT, many everyday tasks have become easier. Many people now skip Google search and get answers to their questions directly from chatbots. In the process, they often share sensitive information with the chatbot, which can be risky: if that information falls into the wrong hands, the consequences can be serious. Here is the kind of information you should never share while talking to ChatGPT or any other chatbot.
Never give personal information
Never share personal information while using a chatbot. Details such as your name, phone number, and address can be used to track your activity. Likewise, never share passwords with a chatbot; doing so can lead to serious losses.
Avoid giving financial information
Never share details such as your bank account number or credit card information with a chatbot. If these fall into the wrong hands, you risk financial loss. Cybercriminals today are constantly finding new ways to steal people's money.
Avoid sharing medical information
Many people share their medical history and related details with chatbots to get health advice. This can be risky: companies may use that data to track you and show targeted ads.
Obscene content
Never use obscene content while interacting with a chatbot. Many chatbots store chat history, and such content can later cause embarrassment if others see it. Moreover, many chatbots block users who send such content, meaning they may never be able to use that chatbot again. So always avoid sharing it.