Top thirteen
telephone salespersons: I would far prefer a DTMF tree to risking whatever order comes out of a conversation with an LLM. Quite possibly an LLM can hallucinate product features more convincingly than a human can lie about them.
solicitors: Been tried and found negligent.
psychologists: Who wants to get sectioned based on the opinions of three LLMs?
further education teaching professionals: Ask a novel question, get back a hallucination.
market and street traders and assistants: Please do not add LLMs to vending machines. Without the physical security of an armored vending machine, LLM street traders will be robbed faster than human ones.
legal professionals: Try telling a judge "ChatGPT said it was legal".
credit controllers: Let's have LLM advocates put their money into this first.
HR admin roles: Consider the training data.
PR professionals, management consultants and business analysts: OK
market research interviewers: They get enough silly responses already. If interviewees think they are talking to a machine, there is an even bigger chance they will test the limits of its credulity.
local government administrative occupations: I have received good service so far, but I write to them and make sure they have all the documents they require. The speed with which an LLM can respond would create an opportunity to probe for exploits.
LLMs are OK when the consequences of failure are trivial or land on other people.