
AI-powered Bing gets confused if you ask too many questions, says Microsoft

Bing Chat won't be able to hold a conversation for as long, because Microsoft doesn't want the model getting confused by too many of our questions.
Written by Liam Tung, Contributing Writer

Microsoft has put some new constraints on its AI-powered Bing Chat tool to stop users from confusing the model with too many prompts. 

There have been reported incidents where Bing Chat started off with friendly conversation, only for it to give way to bizarre exchanges and arguments. Microsoft and OpenAI announced last week that they'd offer users more control over Bing Chat and ChatGPT, respectively. 

In Depth: These experts are racing to protect AI from hackers. Time is running out

OpenAI said it would offer users control over ChatGPT's values, while Microsoft said it would give users more control over Bing Chat because the model, which uses ChatGPT technology, "at times tries to respond or reflect in the tone in which it is being asked to provide responses that can lead to a style we didn't intend." 

Microsoft found that chat sessions involving 15 or more questions cause Bing to become repetitive or prone to being 'provoked'.

As a result, Bing Chat will now be limited to 50 "chat turns" per day and 5 "chat turns" per session. A chat turn consists of a user question and a reply from Bing. 

Once the limit is reached, users will be prompted to start a new topic, and the context will be cleared from the model so it doesn't become confused. 

Also: The best AI chatbots: ChatGPT and other fun alternatives to try

"Our data has shown that the vast majority of you find the answers you're looking for within 5 turns and that only ~1% of chat conversations have 50+ messages. After a chat session hits 5 turns, you will be prompted to start a new topic. At the end of each chat session, context needs to be cleared so the model won't get confused. Just click on the broom icon to the left of the search box for a fresh start," Microsoft's Bing team said. 

Last week, the company revealed two key lessons from its week-long experiment with users in the limited preview. 

Because long chat sessions can confuse the model, it said, "we may need to add a tool so you can more easily refresh the context or start from scratch". 

Also: What is ChatGPT and why does it matter? Here's everything you need to know

Microsoft hopes that Bing Chat will redefine what online search is all about. So far, Microsoft says that millions of people have signed up to the waitlist for the new Bing search. It is prioritizing access for users who set Edge as their default browser and Bing as their default search engine. 

Not everyone is happy about the change: some users have complained that the five-turn limit defeats the purpose of Bing Chat. Microsoft-backed OpenAI has not put similar constraints on ChatGPT. 

Yusuf Mehdi, Microsoft corporate vice president and consumer chief marketing officer, said Microsoft is taking feedback and will continue to iterate and improve the feature. 
