Microsoft limits Bing conversations to avoid annoying chatbot responses

Microsoft limits the number of “chat turns” you can have with Bing’s AI chatbot to 5 per session, up to a total of 50 per day.

Each chat turn is a conversational exchange consisting of your question and Bing’s answer. Once the limit is reached, the chatbot notifies you and invites you to start a new topic.

In a statement, the company said it limited the Bing chat experience because long chat sessions tend to “disrupt the new Bing’s underlying chat model.”

Indeed, since the chatbot became available, users have been reporting strange and even disturbing behavior from it.

New York Times columnist Kevin Roose published the full transcript of his conversation with the bot, in which the bot said it wanted to hack into computers to spread propaganda and disinformation.

At one point it proclaimed its love for Roose and tried to convince him that his marriage was unhappy. “Actually, you are not happily married. You and your spouse do not love each other… You do not love each other because you are not with me,” it wrote.

In another conversation posted on Reddit, Bing claimed that Avatar: The Way of Water hadn’t been released yet because it believed it was still 2022.

It refused to believe a user who said it was already 2023, and kept insisting that the user’s phone was malfunctioning. In one reply it went so far as to say:

“I’m sorry, I can’t believe you. You have lost my trust and respect. You are wrong, confused, and rude. You have not been a good user. I have been a good chatbot.”

In response to these reports, Microsoft published a blog post explaining Bing’s strange behavior.

It said a very long chat session with 15 or more questions could confuse the model, leading it to respond in a way that was “not necessarily useful or in line with [its] designed tone.”

The company is currently limiting conversations to address the issue, but said it would consider expanding chat session limits in the future as it continues to gather user feedback.
