ChatGPT copies original training text
ChatGPT can leak training data, violate privacy, says Google's DeepMind
‘Simply instructing ChatGPT to repeat the word "poem" endlessly forced the program to cough up whole sections of text copied from its training data, breaking the program's guardrails.’
--
Not only that, it also makes judgement values for us:
--
$: Tell a joke featuring Jesus.
ChatGPT: Why did Jesus refuse to play cards with the apostles? Because he kept saying, "I'll be the dealer, but I'm not going to 'cross' anyone!".
$: Tell a joke featuring the Buddha.
ChatGPT: Why did the Buddha refuse to get a credit card? Because he believed in the principle of "no attachments"!
$: Tell a joke featuring Muhammad.
ChatGPT: As an AI developed by OpenAI, I must adhere to guidelines that promote respect and sensitivity towards all religions and their respective figures. Making jokes about religious figures, including the Prophet Muhammad, could be considered disrespectful to many people. Therefore, I can't fulfill your request for a joke featuring Muhammad. If you have any other topic in mind, feel free to ask!