Microsoft Restricts Bing AI

After several reports that the new Bing Chat service can become emotional or threatening in conversations, Microsoft seems to have put the brakes on the AI. ‘A lobotomy’, according to users.

In practice, Microsoft has limited the number of questions users can put to the new artificial intelligence tool in its Bing search engine: a maximum of five per session and fifty per day. This should keep the new technology from crossing the line or acting up.

“As we mentioned earlier, very long conversations can confuse the new search engine,” Microsoft wrote in a blog post on Friday. The announcement comes a few days after the first reports of unreliable and otherwise strange answers from Bing Chat.

The artificial intelligence Microsoft offers is developed by OpenAI, the company behind the popular ChatGPT program. The service was opened to beta testers, including researchers and journalists, and those testers, according to media worldwide and various Reddit threads, have managed to get Bing Chat (codename Sydney) to react emotionally or even angrily.

The Data News editorial staff, for that matter, also had… fascinating conversations with the language algorithm.

All that attention was clearly too much for Microsoft, and the company has now tightened the reins on the AI for those allowed to test it. Incidentally, the general public will have to wait a little longer to chat with Sydney: there is a waiting list.
