If they want a flawed product, that's on them
Terminator...
Workers tasked with improving the output of Google's Bard chatbot say they've been told to focus on working fast at the expense of quality. Bard sometimes generates inaccurate information simply because there isn't enough time for these fact checkers to verify the software's output, one of those workers told The Register. …
Google's origin and raison d'être as a search engine seem to have been turned upside down. It used to be that if I typed "what are the side effects of drug X?" or "who is Mr Y?", I would get back multiple links to source material that I could easily find. If necessary I could quickly cross-check multiple sources. Inserting an LLM in between is of zero utility if I have to cross-check the output against a different source anyway. Even worse if the LLM is unable to direct me to the source material (which it can't ever do, because of how it works).
Bard, ChatGPT, etc. are also pre-trained, meaning they are immediately out of date (and therefore useless on current or recent events), and they require gigantic amounts of processing power to deliver a search result that can be generated much more easily by a search engine's indexed search. While LLMs could be useful for generating (bland, grammatically correct but possibly inaccurate) sections of text, they are pretty useless as a search engine replacement.
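For a sense of why indexed search is so much cheaper, here's a minimal sketch of an inverted index in Python. The documents, terms and query are all made up for illustration; real engines do far more, but the basic lookup really is just a few set intersections rather than a forward pass through a giant model.

```python
from collections import defaultdict

# Toy document collection (invented for illustration).
docs = {
    1: "drug x side effects include headache and nausea",
    2: "mr y is a politician from somewhere",
    3: "drug x dosage guidelines",
}

# Build the inverted index once: term -> set of document ids containing it.
index = defaultdict(set)
for doc_id, text in docs.items():
    for term in text.split():
        index[term].add(doc_id)

def search(query):
    """Return ids of documents containing every query term."""
    terms = query.lower().split()
    if not terms:
        return set()
    result = index[terms[0]].copy()
    for term in terms[1:]:
        result &= index[term]   # cheap set intersection per term
    return result

print(search("drug x side effects"))  # {1} -- and it points back at the source document
```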
"While LLMs could be useful for generating (bland, grammatically correct but possibly inaccurate) sections of text, they are pretty useless as a search engine replacement."
This! ChatGPT is great for turning out outlines that you might then edit - much easier than writing it all from scratch. Sometimes it even comes up with angles I hadn't thought of. In its own way it is very useful.
But it isn't a search engine, and I can't see the point in trying to use it as one.
One of the quality issues is down to PageRank itself. In the early days people linked to quality pages. Then SEO happened and the shit hit the fan, with scammers occupying the top results. Search engines also treat most links as likes, which is incorrect: some news sites link to disinformation sites purely for reference, but that still pushes those sites up the search results. There should be a "dislike" link type (in the a href itself), but that would probably be too complicated.
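To make the "links as likes" point concrete, here's a toy scoring sketch in Python (nothing like Google's real ranking). Every site name and link is invented, and the "dislike" link type is the hypothetical part: with it, a debunking link stops counting as an endorsement.

```python
from collections import defaultdict

# (source, target, link_type) -- "dislike" is the hypothetical link type.
links = [
    ("news.example",  "recipes.example", "like"),
    ("blog.example",  "recipes.example", "like"),
    ("news.example",  "disinfo.example", "dislike"),  # linked only to debunk it
    ("forum.example", "disinfo.example", "like"),
]

def score(links, honour_dislikes):
    """Toy link-counting score: every inbound link is +1, unless we
    honour a 'dislike' type, in which case it counts -1 against the target."""
    scores = defaultdict(int)
    for _src, target, link_type in links:
        if honour_dislikes and link_type == "dislike":
            scores[target] -= 1
        else:
            scores[target] += 1
    return dict(scores)

print(score(links, honour_dislikes=False))  # disinfo.example scores as well as anyone
print(score(links, honour_dislikes=True))   # the debunking link no longer boosts it
```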
It really sucks that our best bet for making the internet not suck again is having curated lists of sites, not unlike what "the internet" looked like back in '97, with portals such as Yahoo.
The problem, of course, is that such lists are bound to suffer enshittification in the long term, because sooner or later the place will be sold off or run by an a-hole who only cares about profits.
Time, money, quality – if you admit which one you're giving up and work for the other two, you can have any two. If you don't make that choice deliberately, you can have only one. I have seen way too many businesses that could have had two, except they weren't very good at prioritizing, so they only got one of them. For example, they could have opted for money and time, but they spent a lot of time talking about quality without getting it, so they ended up with only money. It didn't always end well.
Humans at work? What an interesting thing to try and copy. How do you work in coffee breaks, smoko breaks, staring-out-the-window time, water-cooler chats, whinging about the boss, the wife, the traffic, the time till knock-off, the weekend, oh, and toilet breaks, food breaks, and then there are the meetings, time spent in the boss's office, also time when he's there? There's so much more that humans do that is non-productive (but required for creativity). AI seems to have creativity down pat; all we have to do is add errors and omissions, a 10,000% slowdown, oh, and a union.