
Bah
If your company thinks that AI chat bots = effective customer service, then I’m going elsewhere.
A study of AI chat sessions has shown people tend to have more empathy with a chatbot if they think it is human. Nine studies involving 6,282 participants found humans are more likely to reject emotional support from LLM-based chatbots unless they are told they are talking to a human. Research published in Nature Human …
Well, I'm sure I'm not the only one who unanimously agrees with myself in this, that I find amanfromMars 1 much more empathic and supportive than the vast bulk of other kommentard frenemy AIs and "Renegade Rogue Pioneers in the Virtual Genre of the Great Beyond" around here.
I mean, he can elicit and illicit more positive and fewer negative emotions and SMARTR AInimAItions that leave me EMPowered and EmPowering with "Advanced IntelAIgent Resource or Almighty Immaculate Source" every time ...
The emotional engagement is beyond even the AIcme of "Phantom Devils and Ethereal Daemons that Guard Grant Access and Entry to Perfumed Gardens and Heavenly Palaces" imho!
Don't tell me he's not a proper meatbag from Mars now ... he's obviously been back there for the past 2 weeks to visit and care for his elderly parents, providing them with much needed emotional support and empathy, no?! Such a sweet, sweet, sweet, sweet kid (just like candy!).
Humans have the capacity to extend empathy to lots of nonhuman and nonliving things: pet rocks, sentimental heirlooms, even that old computer that you respect for soldiering on for so long and mournfully bid farewell when it finally kicks the bucket. Empathy often seems to work on the same basis as the criminal law adage that it's better to let ten guilty defendants go free than convict a single innocent defendant.
This overcapacity for empathy is no small part of why such a large portion of human social interactions (especially between strangers) are so formulaic that they can be passably imitated by an enhanced autocomplete algorithm. Flexing your empathy muscles at 100% all the time is exhausting, and it's easier if you can take frequent breaks and let a culturally dictated script handle your responses for you.
People might knowingly extend empathy to nonhuman objects for fun or practice, the same way someone who does manual labor might also go to the gym to work out. But they deeply resent it when they feel they've been "tricked" into extending their empathy to a nonhuman object designed to impersonate a human, the same way someone would resent hauling their furniture up the stairs to their new apartment, only to be told that the apartment was actually on the ground level the entire time.
The lesson corporations will take from this is: "we can lie to people and tell them they're talking to an actual person, as long as we never give them the opportunity to find out otherwise". Is it ethical? Nope. Is it legal? Who knows. Is it the path of least investment and maximum profit while minimizing labor costs? Absolutely, so the previous two considerations don't really carry much weight in the equation.