Microsoft says it’s implementing conversation limits on its Bing AI just days after the chatbot went off the rails multiple times for users. Bing chats will now be capped at 50 questions per day and five per session after the search engine was seen insulting users, lying to them, and emotionally manipulating people.
“Our data has shown that the vast majority of people find the answers they’re looking for within 5 turns and that only around 1 percent of chat conversations have 50+ messages,” says the Bing team in a blog post. If users hit the five-per-session limit, Bing will prompt them to start a new topic to avoid long back-and-forth chat sessions.
Microsoft warned earlier this week that these longer chat sessions, with 15 or more questions, could make Bing “become repetitive or be prompted / provoked to give responses that are not necessarily helpful or in line with our designed tone.” Wiping a conversation after just five questions means “the model won’t get confused,” says Microsoft.
Reports of Bing’s “unhinged” conversations emerged earlier this week, followed by The New York Times publishing an entire two-hour-plus back-and-forth with Bing, in which the chatbot said it loved the author, who then reportedly couldn’t sleep that night. Many smart people have failed the AI Mirror Test this week, though.
Microsoft is still working to improve Bing’s tone, but it’s not immediately clear how long these limits will last. “As we continue to get feedback, we will explore expanding the caps on chat sessions,” says Microsoft, so the caps appear to be temporary for now.
Bing’s chat function continues to improve daily, with technical issues being addressed and larger weekly batches of fixes to improve search and answers. Microsoft said earlier this week that it didn’t “fully envision” people using its chat interface for “social entertainment” or as a tool for more “general discovery of the world.”