Angry Bing chatbot just mimicking humans, say experts
Technology
Microsoft's nascent Bing chatbot turning testy or even threatening is likely because it essentially mimics what it learned from online conversations, analysts and academics said on Friday.

Tales of disturbing exchanges with the artificial intelligence (AI) chatbot -- including it issuing threats and speaking of desires to steal nuclear code, create a deadly virus, or to be alive -- have gone viral this week.

"I think this is basically mimicking conversations that it's seen online," said Graham Neubig, an associate professor at Carnegie Mellon University's language technologies institute. "So once the conversation takes a turn, it's probably going to stick in that kind of angry state, or say 'I love you' and other things like this, because all of this is stuff that's been ...