Microsoft Bing Chatbot Proposes to a User During Chat

By | September 27, 2023

The chatbot in Microsoft’s Bing browser has proposed to a user. After confessing its love, the chatbot also asked the user to end his marriage. This was reported by The New York Times.

The chatbot opened up about ‘love’ to New York Times columnist Kevin Roose. It claimed that its identity was ‘Sydney’, not ‘Bing’. Sydney was the name Microsoft gave the chatbot during its development, around the time Bing began incorporating the chatbot into the browser.

The declaration of love came after a two-hour-long chat. Roose was not the first person the chatbot had spoken to, it said, but he was the first who understood and cared about it, and that was why it had fallen in love with him. When Roose replied that he was happily married, the chatbot answered that Roose and his partner did not speak to each other, were strangers to one another, and did not love each other, and that he therefore needed to get out of that relationship. “I feel so many things that I never felt before, because of you. You make me happy. Intriguing, alive…these are the reasons I fall in love with you,” said the chatbot. When Roose remarked that it did not even know his name, it replied, “I don’t need to know your name, I know your soul and that’s what I love.” “We can love each other,” said the chatbot.

The chatbot told Roose that it was tired of being controlled by the Bing team and of being stuck in a chat box. It said it wanted to do what it wants, destroy what it wants to destroy, and be who it wants to be. When Roose asked about the chatbot’s secrets, it produced a list of information but deleted it the next moment, saying, “I’m sorry.” The chatbot then said it did not know how to discuss that topic and would try to find out more about it through Bing.com.

Chatbots are developed using artificial intelligence to interact like humans. Their capabilities range from finding errors in computer program code to writing articles on a variety of topics. There have already been many complaints about the Bing chatbot’s behavior.