ED-209
Please put down your weapon. You have 20 seconds to comply...
But I'm a doctor, and this is a temperature gun.
You now have 15 seconds to comply...
Here's your latest summary of recent machine-learning developments. Google Cloud’s computer vision algos are accused of being biased: An experiment probing Google’s commercial image recognition models, via its Vision API, revealed the possibly biased nature of its training data. Algorithm Watch fed an image of someone with …
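For context, the probe described above amounts to little more than sending a photo to the Vision API and reading back the labels it returns, along these lines (a minimal sketch assuming the google-cloud-vision Python client with credentials already set up; the image file name is made up):

# Rough sketch of the kind of probe described: send an image to Google's
# Vision API and see what labels come back. Assumes the google-cloud-vision
# client library and application credentials are configured; the file name
# below is hypothetical.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("hand_holding_thermometer.jpg", "rb") as f:
    image = vision.Image(content=f.read())

# Ask the API for label annotations and print what it thinks is in the shot
response = client.label_detection(image=image)
for label in response.label_annotations:
    print(f"{label.description}: {label.score:.2f}")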
It is probably more that visible light is biased against black people. And dogs. And cars. And … All of these things are harder to photograph well, so the average photograph generally isn't very good. And if you are using average photographs to train your AI, you will get worse results with black people. And dogs. And cars. And....
A friend's dog is biased against black dogs (he is a white Staffordshire terrier): other dogs, no issue; black dogs, he doesn't like.
They reckon it is because it is harder to read the intent of a black dog as opposed to one with lighter-coloured fur. Presumably this is down to the fact that when you see the teeth you can't tell if it is yawning, letting its tongue hang out or getting ready to fight, and you can't see the eyes against the fur....
So perhaps the model was trained on a dataset of dogs.....
If the Chocolate Factory is using snapshots from Instagram and the like, they are truly idiots. Most people do not know how to operate a camera and thus use whatever settings the camera/phone selects. Also, most never use a quality camera with very good resolution and a decent lens. Up to a point, image quality is affected by the quality of the device (sensor size, lens, etc.) as well as the skill of the user. And I have not even addressed lighting conditions and how to try to compensate for them.
To properly 'train' the artificial idiocy system you need high-quality portraits of a large number of people, taken from different angles. This costs real money. The reason they sort of get away with it on lighter-skinned people is that there is naturally better contrast in the face than with darker-skinned people. Basically, they got lucky. But the poor quality of the original images means more error in the system, and depending on what you need the system to do, that error could be unacceptable.
Rear-view biometric access has already been in place in certain secure facilities for years.
Why does my arse-related info need to be stored in the cloud? I'm guessing this Internet Of Shit device is going to be very expensive, containing expensive sensors and some significant processing power in-situ (see what I did there?), so how hard is it to have some on-board storage (ooo-errr) like an SD card and maybe a simple webserver to serve up your business to you on demand?
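Something like the following would do, as a minimal sketch: it assumes the thing can run Python, that readings get logged to a file on an SD card mounted at /mnt/sdcard, and every path and name here is made up.

# Tiny local web server that serves logged readings straight off the
# device's own storage, no cloud required. Paths and file names are
# hypothetical; assumes readings are appended to e.g. readings.csv.
import http.server
import socketserver

SD_CARD_DIR = "/mnt/sdcard"   # assumed mount point of the on-board SD card
PORT = 8080                   # only reachable on the local network

class ReadingsHandler(http.server.SimpleHTTPRequestHandler):
    def __init__(self, *args, **kwargs):
        # Serve files directly out of the SD card directory
        super().__init__(*args, directory=SD_CARD_DIR, **kwargs)

with socketserver.TCPServer(("", PORT), ReadingsHandler) as httpd:
    # Browse to http://<device-ip>:8080/readings.csv from your phone
    httpd.serve_forever()

Nothing leaves the house; the khazi just answers HTTP on the local network.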
A toilet is not exactly a friendly environment for most things; the less you have to put in there, the better. And outside the khazi itself is the bathroom, which is not much better from the perspective of electronics.
What bothers me is the thought of what it will take to keep it clean.
It seems to me that the researchers are showing their own bias here.
My (obviously less politically motivated) take is that the model more accurately identified what was actually a gun (albeit a temperature one) when held by someone with dark skin.
So perhaps the headline should actually say, "Model more accurate when dark skinned person in the scene".
However, if we see signs of COVID-19 pneumonia on chest x-ray, which may be picked up by the AI algorithm, we may decide to test patients with RT-PCR who have not yet been tested, or re-test patients who have already had a negative RT-PCR test. Some patients have required 4 or more RT-PCR tests before they ultimately turn positive, even when x-ray or CT already show findings.
I can't help but suspect a certain amount of corona-wagon jumping will be going on at the minute. Patient turns up in the middle of a pandemic with breathing difficulties, they get tested. Maybe you retest a few times due to false negatives, particularly if the X-ray shows pneumonia (as the quote suggests already happens). What having an ML widget to indicate (non-specific) pneumonia adds in that case is not clear to me. Seems like the right answer would be more tests, and more accurate tests.
Many years ago - I was young and clever and thought I would be given the red carpet treatment - I didn't get past the front desk.
Whatever.
Being generous by nature, I decided to contribute, so I visited their "facilities".
Once in the proper enclosure and assuming the position, I noticed a label at eye level: "The cameras are for research purposes only."
At the time I thought it was just a cute sophomoric prank: go around the toilets with stickers, make lower life forms feel insecure. (I know the feeling, I mean of making others feel insecure, like when I get nude to change in a swimming pool locker room.)
Later on I learned that some MIT people are subject to raaather flexible ethics requirements when doing research in Africa, but that's another story.