Microsoft admits long conversations with Bing’s ChatGPT mode can send it haywire


Microsoft’s new ChatGPT-powered Bing has gone haywire on several occasions in the week since it launched, and the tech giant has now explained why.

In a blog post titled “Learning from our first week”, Microsoft admits that “in long, extended chat sessions of 15 or more questions” its new Bing search engine can “become repetitive or be prompted/provoked to give responses that are not necessarily helpful or in line with our designed tone”.