Shame to feckbook!
We humans are all primates!
Facebook has apologized for an "unacceptable error" after its AI systems asked folks who watched a British video about a Black man if they wanted to view more content on "primates." A former Facebook employee spotted the prompt and reported it, and the biz said it was "looking into the root cause." "As we have said, while …
Well, even mentioning the subject (or any subject, in fact / in shame) in any shameful context is shameful. As a matter of fact, even mentioning how shameful it is is shameful (and shocking)! Shame on you then, and shame on myself too! Shame, shame, shame!
There, have I shamed myself enough, or should I keep shaming, just in case, to safely overshoot the ultimate shame frontier of any shameless human on earth (and not forgetting those in spaaaace) by a sufficiently shameful margin? What next? Shameful to say women have titties, and men are stupid? Shame on me and my children! And my children's children!
Insemination.
About four years ago my large animal Vet came in with a funny bit of advertising. He's in his second career: he became a Vet after 25 years as a DBA working for IBM. He knows I'm a computer guy and thought I'd be amused. The ad was for a large animal veterinary practice management software package "NOW WITH AI!!!"
The Vet was laughing, and wondered how many times the company in question got Vets inquiring about their new Artificial Insemination package. Without a pause, I dialed the 800 number ... the answer was over 80% of calls! The guy on the other end wasn't amused when I suggested they fire their marketing genius and hire an AI expert ...
Isn't the point that the algorithms might not have bias explicitly coded into them, but that the huge amounts of data they operate on will certainly reflect existing biases within society, and the algorithms will then simply "learn" those biases and adopt them automatically?
Very convenient that you can then blame the AI: if you start from the assumption that "neutral = good", you're erroneously assuming the training data contains only goodness waiting to be extracted, rather than realising that a system meant to be unbiased can end up learning and reinforcing very undesirable behaviour.
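To make that concrete, here's a minimal, purely illustrative Python sketch (nothing to do with Facebook's actual system; the groups, labels, and rates are made up): a trivial "model" that does nothing more than memorise label frequencies per group from historical data. No bias is coded anywhere in the program, yet the skew in the data is exactly what it learns to reproduce.

from collections import Counter, defaultdict
import random

random.seed(0)

def make_biased_history(n=10_000):
    # Hypothetical historical decisions: group "B" was rejected far more
    # often than group "A", purely because of past human judgements.
    data = []
    for _ in range(n):
        group = random.choice(["A", "B"])
        reject_rate = 0.2 if group == "A" else 0.6  # the bias lives in the data
        label = "reject" if random.random() < reject_rate else "accept"
        data.append((group, label))
    return data

def train(data):
    # "Learn" the most common label for each group, nothing more.
    counts = defaultdict(Counter)
    for group, label in data:
        counts[group][label] += 1
    return {group: c.most_common(1)[0][0] for group, c in counts.items()}

model = train(make_biased_history())
print(model)  # expect {'A': 'accept', 'B': 'reject'}: the skew is faithfully learned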
If it's based on machine learning, it's not an algorithm and it's impossible to know what's "baked into" the process.
"We have a series of complex algorithms that basically look at a variety of vacuous potential responses when asked for a press statement and pick one that matches at least 3 keywords in the request," said spokesdroid Eliza Turingbot.
Both senses of the word are derived from the Latin "Primus", meaning "of the first rank, chief, or principal" ... The reasoning behind naming the Church boss is obvious. The human version comes from the rather Victorian concept that humans are the highest order of mammals.
Presumably the people bitching about the Facebook AI are racists who don't think that black people can be first at anything, much less a higher order animal.
And here I was, under the impression that a Primate was a kind of super Archbishop https://en.wikipedia.org/wiki/Primate_(bishop) and tended to roam cathedrals. And that there's at least one black one (in Nigeria, last I looked; he's also a Cardinal, and was one of the candidates defeated by the current El Papa, possibly because he's somewhere to the right of the previous El Papa, a.k.a. Der Panzerkardinal).
Devil's advocate for a sec...
Without wishing to defend Facebook, this could put an entirely different slant on things.
We're missing some key details:
1) What kind of "primates" were being suggested, and...
2) was the offended chap watching something specifically about prominent black religious figures, perhaps tagged correctly as "Primate", and...
3) was the follow-up suggestion simply text "primates" or was it images of prominent religious figures?
Say he doesn't know the ecclesiastical sense of the term and, let's be realistic, I suspect many people don't; it is really easy to jump instantly to the wrong conclusion.
On the other hand, this is Facebook, so Occam's Razor etc.
From the NY Times article:
"The video, dated June 27, 2020, was by The Daily Mail and featured clips of Black men in altercations with white civilians and police officers. It had no connection to monkeys or primates."
Of course it could be that Facebook's algorithms have worked out that viewers of Daily Mail videos have more in common with monkeys than the general populace. I'm not convinced that many of them aspire to be archbishops.
It makes for a good headline in the paper.
Besides, if you've ever read that overpriced nappy filler, you'd be aware that "science" isn't their strength. They were going for the "big powerful corporate AI calls black people monkeys!" angle. Don't question it, just be outraged over it. Manufactured outrage sells papers...
search algorithm?
didn't know it had one, apart from "look at what cookies it can find on the user's computer and then spam ads for whatever was searched for"
Hence the likes of me getting 3 months of washing machine ads just after searching for and buying a new one.
"He's just got 1 washing machine... hes bound to want 25 more... by different suppliers.... and spam those ads for washing powder/pods too"
I'd love to see what adverts were then targeted at users identified as primates.
Tired of throwing your own excrement?
Is flinging your poo at passing pedestrians getting you down?
Well fret no more! With Robinson's new Hurl-o-matic!
Automatically distribute your own faeces at speeds of up to 50mph, in any direction you choose!