Re: it would still mix up different cases and laws to invent entirely new ones.
What would the programming look like? It could load a page, fail to find supporting evidence, and come to one of the following conclusions:
1. The site is temporarily down.
2. The site is permanently down, but used to show this stuff.
3. The site contains the information, but it has a paywall.
4. The site contains the information, but you have to log in.
5. The site contains the information, but you have to click a few links to assemble it.
6. The site contains the information, but it is blocking bot access in some way.
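To make the ambiguity concrete, here is a minimal sketch of what distinguishing those cases might look like. Everything here is hypothetical: the `FetchResult` fields and the status-code heuristics are assumptions, not how any real verifier works, and a real page can fall into several cases at once.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class FetchResult:
    status: Optional[int]        # HTTP status code, or None if the request never completed
    body: str = ""               # page text, if any was returned
    requires_login: bool = False # e.g. the page redirected to a sign-in form

def classify(result: FetchResult, claim: str) -> str:
    """Guess which of the six cases a failed verification attempt falls into.

    Purely illustrative: these checks are rough heuristics and the cases
    overlap in practice (a paywalled page may also return 403, etc.).
    """
    if result.status is None:
        return "temporarily down"            # case 1: no response at all
    if result.status in (404, 410):
        return "permanently down"            # case 2: gone, may have existed once
    if result.status == 402 or "subscribe" in result.body.lower():
        return "paywalled"                   # case 3
    if result.status == 401 or result.requires_login:
        return "login required"              # case 4
    if result.status in (403, 429):
        return "bot blocked"                 # case 6: access denied or rate limited
    if claim.lower() not in result.body.lower():
        return "scattered or absent"         # case 5, or the claim simply is not there
    return "verified"
```

Note that the last branch is the weakest of all: a naive substring match cannot tell "you have to click a few links to assemble it" apart from "it was never there", which is part of the point.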
This assumes, of course, that the program is capable of reading another site to confirm its facts. But since it made up the facts in the first place, how is it supposed to find a site that verifies stuff it just invented, whether or not that stuff happens to be correct? It can't, because it is going about things backwards.
In some ways, doing this in reverse could make more sense. A bot could take a query, chop it up in a variety of ways, put those chunks through a search engine, read a bunch of the results, and describe what it found to the user. This would probably be much better, though it still would not provide certainty; it might be better still for the user to do the search themselves and have their own brain interpret the results. In any case, that is not the way GPT does it, so expecting it to back up its text is a fruitless hope. Something might eventually do it, but the existing GPT systems never will.
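The reverse pipeline described above can be sketched in a few lines. This is only an illustration of the shape of the idea: `chop` and `answer` are made-up names, and the `search` callable stands in for whatever search-engine API a real system would use.

```python
from typing import Callable

def chop(query: str, size: int = 3) -> list[str]:
    """Split a query into overlapping word chunks to search separately."""
    words = query.split()
    if len(words) <= size:
        return [query]
    return [" ".join(words[i:i + size]) for i in range(len(words) - size + 1)]

def answer(query: str, search: Callable[[str], list[str]], limit: int = 5) -> str:
    """Run each chunk through a search backend and report what came back.

    `search` maps a query string to result snippets; here it is injected
    so the sketch stays self-contained, but a real bot would call out
    to an actual search engine at this point.
    """
    seen: list[str] = []
    for chunk in chop(query):
        for snippet in search(chunk):
            if snippet not in seen:
                seen.append(snippet)
    hits = seen[:limit]
    if not hits:
        return "No supporting results found."
    return "Top results:\n" + "\n".join(f"- {h}" for h in hits)
```

The key difference from the hallucinate-then-verify order is that every snippet in the output came from a search hit, so there is something real to point the user at. It still cannot certify that those hits are true, which is the residual uncertainty the paragraph above is getting at.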