Colour me surprised
Chatbot spouting toxic rubbish. Who would have thought that might happen?
The National Eating Disorders Association (NEDA) has taken down its Tessa chatbot for giving out bad advice to people. In a now-viral post, Sharon Maxwell said Tessa's advice for safely recovering from an eating disorder directly opposed medical guidance. The American non-profit's bot recommended Maxwell count calories, weigh …
NPR obtained leaked recordings of the firing of the newly unionised helpline workers, and Vice ran with those to give the story wider publicity. NEDA soft-launched Tessa in May, then the story went fully public. In response, people have clearly hammered on Tessa, exploiting its weaknesses to prove it isn't actually fit for purpose. Someone needs to explain to the suits that AI can only ever supplement existing staff (reducing future costs as the organisation scales), never replace them, and that any attempt to do so will likely invite novel attacks on their systems. After all, trolling a public-access AI system that learns from its end users is the very opposite of "securing unauthorised access", and so is covered by neither the CMA nor the CFAA.
HQ Plano, TX.
NEDA has not published a yearly financial report since 2019.
CEO Elizabeth Thompson also seems to be CEO of the National Osteoporosis Foundation.
Her LinkedIn page: "Transcend4Good is committed to the growth and innovation of nonprofit organizations and associations. We are organizational strategists, mission innovators, revenue and development professionals, and positive guides through change. Passionate, inspirational experts: we are here to coach, guide, or advise boards, c-suite leaders, and coalitions to achieve success!"
Well, if AI doesn't work out, they can always outsource the helpline to India and the Philippines - much like IBM will be outsourcing their IT work.
Yeah, it still amazes me how many people volunteer, free of charge, to help charities, especially retired people helping out in charity shops. And yet all these major CEO types and other c-suite types, who we regularly hear of retiring early at as young as 50 or so, never seem to offer their skills and experience as voluntary CEOs or board members for charities. You know, exactly the sort of people who not only have the skills, but have the time. There's even a case to be made for those high-flying CEOs on £200k plus to maybe take a year out and do this. It'd look good on their CVs, gain them moral capital, and at those sorts of salaries, they could afford it.
The people who *actually* volunteer are mainly young people looking for experience to help them get a better job, or retired middle-class people, neither of which groups you'd class as well off.
Just taking a guess here, but this seems like it would be sensible advice if your eating disorder involved OVER eating. I wonder if it just didn't ask (or got the wrong impression) and went for that. If you're dealing with mental health, using an AI chatbot seems like a seriously bad idea.
bulimia – an eating disorder characterized by regular, often secretive bouts of overeating followed by self-induced vomiting or purging, strict dieting, or extreme exercise, associated with persistent and excessive concern with body weight.
binge eating – an eating disorder in which a large quantity of food is consumed in a short period of time, often followed by feelings of guilt or shame.
Origin: late Middle English (as bolisme, later bulimy): modern Latin, or from medieval Latin bolismos, from Greek boulimia 'ravenous hunger', from bous 'ox' + limos 'hunger'.
I strongly doubt any of the "advice" quoted in the article would be useful for any overeating condition. It's not like people don't understand the relationship between consumption and weight gain. Obesity has a wide range of causes, and – from what I've read – calorie-counting and body-measuring are rarely helpful, at least in isolation. (Counting calories is particularly misleading because human metabolic efficiency is very sensitive to diet and activity. People aren't furnaces.)
I have sympathy for people who find it difficult to achieve a healthy weight (whatever that might be). I've stayed more or less the same weight my whole adult life, but that's certainly not because of any virtue I can claim. I don't police my diet significantly and I don't exercise for the sake of exercise. There are no doubt various hereditary, environmental, and economic factors at work, but no "willpower" or "discipline" or "smart lifestyle choices". It's just luck, and I could have just as easily found myself on the other side of that coin flip.
No one -- and especially no one calling an eating disorder hotline -- is going to find weight-loss revelation in 'eat less, move more.' We ALL know that. If it's not strictly medical and you can't summon up herculean levels of motivation, you need a coach, a therapist, and as many supportive friends and family as you can get to keep propelling you forward, not some banal tautology. (Or a serious hard drug habit.)
All those Hollywood stars and CEOs who shed 50 pounds and get super buff for a role? They sure as hell didn't do it alone, or after being nagged by a chatbot or a horrible mother-in-law.
What exactly is the point of organisations like this if they can't even be arsed to provide a proper helpline for those who they claim to be there to help?
The idea that a chatbot is in any way appropriate for any sort of mental health related helpline is ridiculous. Many of the people contacting them are likely to have really struggled to get as far as even contacting them, and to then be fobbed off with a chatbot could just make things worse.
They're firing staff because they have too *many* calls? Most nonprofits would try to expand their services and seek more funding, pointing to increased usage, which funders consider a success. I would have thought having lots of callers would make their helpline staff more essential, not less so.
In reality these services should be government-funded anyway and not left to the vicissitudes of the nonprofit world.