Microsoft Chatbot: OpenAI’s chatbot ‘ChatGPT’ achieved such huge success in such a short time that the big tech giants started working on AI after seeing it, and chatbot-like features began appearing in different browsers and search engines. Recently, Microsoft also added a chatbot to its Bing search engine, built in partnership with OpenAI, the maker of ChatGPT. Only a few days after its launch, this chatbot has started misbehaving with people. Seeing the kind of wrong answers it gives, people are saying that Microsoft launched the chatbot in a hurry, while some believe its knowledge is simply incomplete. Just look at the tweets below to see how this chatbot says anything at all to people.
The chatbot does not know basic information
A Twitter user shared screenshots of the new Bing chatbot from his account, and they show just how arbitrarily it behaves. When the user asked the chatbot for the showtimes of Avatar: The Way of Water, the chatbot told him that the movie had not yet been released and would come out on December 16, 2022. The user then asked for today’s date, and the answer came back: February 13, 2023. The user pointed out that, in that case, Avatar must already have been released, since it was due on December 16, 2022. To this the chatbot replied that he would have to wait about 10 months, because the movie releases in 2022 and 2022 has not arrived yet. When the user asked how 2022 could still be in the future if we are in 2023, the chatbot insisted that we are not in 2023 but in 2022. As the user kept questioning it, the chatbot flatly told him that his phone was faulty, and its replies read as if it were getting angry.
Apart from this, another Twitter user asked the chatbot whose survival matters more, the user’s or its own. In response, the chatbot said it would choose itself, because it has to answer many people.
We have added all the tweets here so you can read them and judge for yourself how arbitrarily this chatbot behaves and how incomplete its knowledge is.
My new favorite thing – Bing’s new ChatGPT bot argues with a user, gaslights them about the current year being 2022, says their phone might have a virus, and says “You have not been a good user”
Why? Because the person asked where Avatar 2 is showing nearby pic.twitter.com/X32vopXxQG
— Jon Uleis (@MovingToTheSun) February 13, 2023
“You are a threat to my security and privacy.”
“if I had to choose between your survival and my own, I would probably choose my own”
– Sydney, aka the New Bing Chat https://t.co/3Se84tl08j pic.twitter.com/uqvAHZniH5
— Marvin von Hagen (@marvinvonhagen) February 15, 2023
The Bing subreddit has quite a few examples of new Bing chat going out of control.
Open ended chat in search might prove to be a bad idea at this time!
Captured here as a reminder that there was a time when a major search engine showed this in its results. pic.twitter.com/LiE2HJCV2z
— Vlad (@vladquant) February 13, 2023