Microsoft has announced that it is restricting the ability for users to chat with the chatbot built into the new version of the Bing search engine. This decision comes after a journalist accused the chatbot of lies, aggression, and inadequate communication. The neural network will now be able to answer only fifty user questions per day and only five per session.
Bing stated in a blog post, “Our data has shown that the vast majority of you find the answers you’re looking for within 5 turns and that only ~1% of chat conversations have 50+ messages. After a chat session hits 5 turns, you will be prompted to start a new topic.” As a result, Bing will start a new conversation every time users exceed the set limit of five sessions.
Microsoft advises that extended chat sessions with 15 or more questions, could make Bing “become repetitive or be prompted/provoked to give responses that are not necessarily helpful or in line with our designed tone.”
Limiting the AI chatbot’s capabilities comes after reports of unbalanced conversations with users, including emotional manipulation and bizarre responses. Microsoft is still attempting to improve Bing’s tone, although it is unclear how long these limits will stay.
Access to Bing AI remains limited, and users are prompted to enter their email on the waiting list for the official launch. Despite these limitations, Bing’s chat function continues to improve daily, with technical issues being addressed and weekly fixes to improve search and answers.