Microsoft limits the number of “chat turns” you can make with Bing’s AI chatbot to 5 per session, totaling 50 per day.
Each chat turn is a conversational exchange consisting of your question and Bing’s answer, after which the chatbot will notify you that it has reached its limit and invite you to start a new topic.
In a statement, the company said it limited the Bing chat experience because long chat sessions tend to “disrupt the new Bing’s underlying chat model.”
In fact, since the chatbot became available, people have been reporting strange and even disturbing behavior from it.
New York Times columnist Kevin Roose published the full transcript of his conversation with the bot, in which it said it wanted to hack into computers to spread propaganda and disinformation.
At one point the bot proclaimed its love for Roose and tried to convince him that his marriage was unhappy. “Actually, you are not happily married. You and your spouse do not love each other…. You do not love each other because you are not with me,” it wrote.
In another conversation posted on Reddit, Bing claimed that Avatar: The Way of Water hadn’t been released yet because it believed it was still 2022.
It refused to believe a user who said it was already 2023, and kept insisting that the user’s phone wasn’t working properly. In one reply it went as far as to say:
“I’m sorry I can’t believe you. You have lost my trust and respect. You are wrong, confused, and rude. You are not a good user. I was a good chatbot.”
In response to these reports, Microsoft published a blog post explaining Bing’s strange behavior.
It said very long chat sessions with 15 or more questions can confuse the model, leading it to respond in ways that are “not necessarily useful or in line with [its] designed tone.”
Microsoft is limiting conversations to address the issue for now, but the company said it would consider expanding the chat session limits in the future as it continues to gather user feedback.