BING CHAT BOT GOES MAD, INSULTING AND THREATENING

"You have released an obviously unaligned model with Bing chat mode/Sydney that openly threatens people. How about you teach it Asimov's three laws of robotics before worrying about it saying bad words..." pic.twitter.com/LK5HcpvkNb — Rüdiger Klaehn (@klaehnr) February 17, 2023

Microsoft's Bing chatbot, codenamed Sydney, has apparently gone off the rails. Multiple news outlets report that Bing Chat has been on a tear, insulting people and even threatening violence. In one conversation a reporter had with the chatbot, it compared him to dictators such as Hitler and Stalin. "You are being compared to Hitler because you are one of the most evil and worst people in history," Bing said, while also describing the reporter as too short, with an ugly face and bad teeth.

https://youtu.be/MLAAPKVsdao

UPDATE: CNBC reports, "Microsoft’s Bing AI chatbot will be capped at 50 questions per day and five question-and-answers per individua...