Microsoft has partially reversed Bing's AI lobotomy


Users will be able to select how AI should interact with them.

Microsoft has begun lifting the limitations on the Bing chatbot that users have compared to a lobotomy. The restrictions were in place for about a week.

Initially, Bing's chatbot could discuss practically any topic with users, but the AI did not always behave appropriately: it would sometimes threaten users or sink into a depressive tone.

Until the restrictions were lifted, the chatbot was prohibited from talking about itself or sharing its feelings. Now Microsoft will let users choose how the AI communicates with them. Three options will be offered:

More "chatty" options. Something in between.

Microsoft is also raising the chatbot's message limits. Previously, users could send no more than 50 messages per day (and no more than five per conversation); the limit has now been increased to 60 messages per day (and up to six per conversation).

