
Microsoft did not expect Tay to behave this way, nor did it expect Tay to hang "itself".
Oh, sorry, "it" was put down, silly me.
AI chatbots can act like social experiments, offering a glimpse into human culture, for better or for worse. Microsoft and Bing researchers found this out when they trialled their chatbots on China's hugely successful messaging platform, WeChat, and on Twitter. The Chinese chatbot, XiaoIce, went viral within 72 hours and has …
I always thought that ego and escapism are the roots of racism and are found in every human being.
A bot that just copies text based on its previous usage doesn't prove that it has an ego or that it was feeling oppressed (by the other oppressed ones).
I'd rather compare Tay with Clippy.