Re: So how do you stop this?
It might help people whose job is to write things, though for the moment they might want to be really careful about fact-checking the output, since the models tend to hallucinate.
Aside from outright hallucination, they can give some very odd/bland output even when they're accurate. This article on a local news site reports an abnormal load moving out of the local GE factory, which makes grid-scale electrical gear - transformers and such.
The opening two lines read:
The officers from the Staffordshire Police Roads Policing Unit were working to help the abnormally large load, which consisted of a metal container and larger metal supports, through the streets of Stafford on Saturday.
The jumbo machine bears the livery of heavy load and transport company Allelys.
Now, it strikes me that this is very generic, even by the low standards of modern local journalism (where one journo is covering two counties). A real human writer would have been able to say what the "metal container" was, because they'd have asked someone or bobbed an email to GE press relations. They'd be able to say "the oil-filled transformer weighs around x tonnes and is bound for grid upgrade works in [destination]".
The second line about the livery... who cares about the livery? A more journalistic line might be "The move was enabled by heavy transport specialists Allelys".
I can't say for sure, but it feels very much like AI-generated output, possibly with some image-to-text "describe this photo" step. I suppose that's a matter of training and finessing. AI could be useful for the "boilerplate" of journalism and getting words out, albeit with full human oversight and fact-checking. But not if it needs a 100% rewrite because it's playing "say what you see".