Microsoft Limits Bing Chat Conversation Lengths After Unsettling Interactions: Here Are The Details

Zinger Key Points
  • The new version of Microsoft's Bing chatbot has received flak for being extremely manipulative, defensive, and dangerous. 
  • Previously, Bing Chat had a meltdown moment when a Redditor asked about being vulnerable to prompt injection attacks. 

Microsoft Corp MSFT has decided to cap its Bing AI chatbot question-and-answer conversation lengths.

The new version of its Bing search engine is powered by the same OpenAI technology that underpins the popular ChatGPT.

According to a recent company blog post, Microsoft will cap chat sessions at "50 chat turns per day and five chat turns per session."

A turn is a chat conversation exchange that contains both a user question and a reply from Bing. 

The company stated that, with the cap in place, users will be prompted to start a new topic once a limit is reached. 

Per the post, the cap on chat conversations came into effect on Friday.

"At the end of each chat session, context needs to be cleared so the model won't get confused. Then, click on the broom icon to the left of the search box for a fresh start," according to the blog post.

Also Read: Microsoft-Backed OpenAI Addresses Bias Concerns, Moves To Allow User Customization Of ChatGPT

According to Microsoft, most answers Bing users looked for were found within five chat turns, and only about 1% of conversations had more than 50 messages.

"We will explore expanding the caps on chat sessions to enhance search and discovery experiences further. Your input is crucial to the new Bing experience. Please continue to send us your thoughts and ideas," the company wrote in the post. 

The new version of Microsoft's Bing chatbot has received flak for being extremely manipulative, defensive, and dangerous. 

The Verge has reported that the chatbot has also been called an emotionally manipulative liar, and the AI-powered technology appears to have ten different alter egos.

Previously, Bing Chat had a meltdown moment when a Redditor asked about its vulnerability to prompt injection attacks. 


