Re: Racial Bias and Intent
Maybe I should have posted "Many AI systems exhibit the unintended and unrecognised racial bias of their programmers", or perhaps "Many AI systems exhibit the unintended racial bias of the data sets on which they are trained."
For example, a post above states that the police in the UK need reasonable cause to stop and search someone. That is true; however, black people driving 'nice' cars (Mercedes, Jaguars, BMWs, etc.) are more likely to be stopped than white people driving the same vehicles (see several recent examples in the UK press). There is no crime of 'Driving while Black' on the UK statute book.
Similarly, suppose criminality is evenly spread across the population irrespective of skin colour, but one section of society, defined by skin colour, is more likely to be stopped and prosecuted. More crimes will then be recorded against that section of society, so it acquires an unfair reputation for criminality, making it more likely to be targeted than otherwise, which leads to yet more detected crime in a vicious circle.
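To make the feedback loop concrete, here is a minimal sketch with entirely hypothetical numbers (the group names, rates, and re-allocation rule are my own illustrative assumptions, not real policing data). Both groups have the identical underlying offence rate; the only difference is the initial stop rate:

```python
# Hypothetical toy model of the feedback loop: equal true offence rates,
# unequal stop rates, with police attention re-allocated each round in
# proportion to recorded (detected) offences.

def simulate(rounds=5, population=100_000, offence_rate=0.01):
    # Group B starts out being stopped twice as often as group A.
    stop_rates = {"A": 0.05, "B": 0.10}
    recorded = {"A": 0.0, "B": 0.0}
    for _ in range(rounds):
        for group, stop_rate in stop_rates.items():
            stops = population * stop_rate
            # Same true offence rate for both groups, so recorded crime
            # simply mirrors how often each group is searched.
            recorded[group] += stops * offence_rate
        # Re-allocate a fixed total amount of stop-and-search attention
        # in proportion to each group's recorded offences so far.
        total = recorded["A"] + recorded["B"]
        stop_rates = {g: 0.15 * recorded[g] / total for g in recorded}
    return recorded, stop_rates

recorded, stop_rates = simulate()
print(recorded)    # group B ends up with twice the recorded crime of group A
print(stop_rates)  # and continues to be stopped twice as often
```

Despite identical underlying behaviour, the initial 2:1 stop disparity reproduces itself indefinitely: group B accumulates twice the recorded crime, which in turn "justifies" stopping group B twice as often. The bias in the data sustains itself without anyone intending it.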
I'm sure that the poster above (sorry, I didn't note the tag name) does not consider themselves racist, but the attitude presented, that more people of a specific skin colour are found carrying knives, so searching more of 'them' will find more knives and therefore detect more crime, lends itself to unintentional racial discrimination.
I hope this explains my original comment concerning racial bias.
(It's OK, I know I'm at risk of major downvoting.)