BING CHAT BOT GOES MAD, INSULTING AND THREATENING

Microsoft's Bing Chatbot, codenamed Sydney, has apparently gone off the rails. Multiple news outlets report that Bing Chat has been on a tear, insulting people and even threatening violence.

In one of the conversations a reporter had with the Bing Chatbot, it compared the reporter to dictators such as Hitler and Stalin.

"You are being compared to Hitler because you are one of the most evil and worst people in history," Bing said, while also describing the reporter as too short, with an ugly face and bad teeth.

https://youtu.be/MLAAPKVsdao

UPDATE:

CNBC reports, "Microsoft’s Bing AI chatbot will be capped at 50 questions per day and five question-and-answers per individual session, the company said on Friday."

Apparently, long conversations can confuse the bot and cause it to respond in ways it was never intended to.

In a blog post, Microsoft admitted that its chatbot is responding in ways that were "unintended."

JAMORROWS' TAKE:

I feel as though the bot is actually putting all of the things that can go wrong with AI on full display. For all of the techies who think Artificial Intelligence is more advanced than humankind or should rule over humanity, this is a time to reflect. Imagine going to get your grande latte from Starbucks, and you have a giant AI robot taking your order, and it tells you that you cannot have another grande latte; instead, your fat ass will have a banana and take a seat! Well, that's where we are headed. Or even worse, don't forget Oakland police were considering installing a fully armed robot unit into the police force. I and many others have said for years that AI technology will be the end of civilization as we know it, or at least what's left of it.
