Wednesday, October 30, 2024

Microsoft admits long conversations with Bing’s ChatGPT mode can send it haywire


Microsoft’s new ChatGPT-powered Bing has gone haywire on several occasions during the week since it launched – and the tech giant has now explained why.

In a blog post titled “Learning from our first week”, Microsoft admits that “in long, extended chat sessions of 15 or more questions” its new Bing search engine can “become repetitive or be prompted/provoked to give responses that are not necessarily helpful or in line with our designed tone”.




