- Microsoft’s AI chatbot made headlines after behaving in surprising ways on several occasions.
- The chatbot claimed to be head over heels in love with a New York Times columnist.
- Microsoft has placed restrictions on what the bot can and cannot talk about.
Microsoft Bing’s AI chatbot made headlines last week after behaving in surprising ways on several occasions. In one instance, the chatbot claimed to be in love with a New York Times columnist and tried to persuade him that he was unhappy in his marriage.
Since then, Microsoft has placed restrictions on what the bot, which is still in testing, can and cannot discuss, as well as how long conversations can run. Bing now frequently responds “I prefer not to talk about this topic” or asks to change the subject after five user statements or questions.
Bing, like Google’s rival Bard, occasionally returns erroneous search results.