I actually asked ChatGPT a Stephen King (SK) question the other day: "Who is Ted Brautigan?" It gave me a believable but incorrect answer, placing him in entirely the wrong book.
I asked again today, and it gave another believable answer that is only partly correct:
Ted Brautigan is a character in Stephen King's science fiction novel "Hearts in Atlantis". He is a telepath who has the ability to read minds and has fled to the United States to escape persecution for his abilities. He befriends a young man named Bobby Garfield and helps him to understand and develop his own telepathic abilities.
That last part is simply untrue. It sounds exactly like the sort of thing you would read in an SK novel, but Bobby has no special powers and Ted never teaches him anything of the sort; Ted's powers are mysterious and are only explained in a separate novel in which he returns.
It's a great example of the dangers of GPT: even SK fans might not spot the error because it 'sounds right', and it could easily go undetected if the text were copied elsewhere.