Prejudiced humans = prejudiced algorithms, and it's not an easy fix

It is half a century since the days when London B&Bs would welcome guests with notices stating "No Irish, no blacks, no dogs." Twenty years since, as marketing manager for a major UK financial institution, I had to inform our underwriting department that adding "does not apply to disabled persons" to an ad for motor insurance …

  1. Voland's right hand Silver badge

    There is an inherent difficulty in this

    Paraphrasing an old Russian (and of course discriminatory) joke about a blond driving:

ML is like a monkey with a hand-grenade - you never know where it is going to throw it.

This is an inherent problem with ML versus other approaches like optimal control, probability and statistics, etc. When you are using ML you are "guessing" the model which describes the process. You are not fitting data to a proposed model. So you have no clue what empirical model the NN will converge to.
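    A minimal sketch of that distinction, with made-up data (nothing here comes from the article; the model sizes and numbers are purely illustrative):

    import numpy as np
    from scipy.optimize import curve_fit
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    x = np.linspace(0, 10, 200)
    y = 3.0 * x + 2.0 + rng.normal(0, 1.0, x.size)  # true process: linear

    # Statistics route: propose a model form up front, then estimate it.
    def proposed_model(x, a, b):
        return a * x + b

    (a, b), _ = curve_fit(proposed_model, x, y)
    print(f"fitted a={a:.2f}, b={b:.2f}")  # interpretable parameters

    # ML route: no proposed form; the network converges to some empirical model.
    nn = MLPRegressor(hidden_layer_sizes=(50, 50), max_iter=2000, random_state=0)
    nn.fit(x.reshape(-1, 1), y)
    # Thousands of weights and no "a" or "b" to inspect: you can't easily say
    # *what* model it converged to, only measure how it behaves.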

    So if it comes up with a silly answer, well, it comes with the territory.

Same as when we were little. Just listen to your children for a while. Or search for "kids' humor" on Google.

    1. Anonymous Coward
      Anonymous Coward

      Re: There is an inherent difficulty in this

Yep. There is a reason gaming utilises the GPU for graphics, while Excel and accountants very rarely do. ;)

One needs exact numbers every time; the other just needs any old answer out at 60 fps.

    2. The Man Who Fell To Earth Silver badge
      Devil

      Prejudiced humans = prejudiced algorithms

      So, is this why Bender's dream is to kill all humans?

  2. PushF12
    Big Brother

Orwellian redefinition of "prejudice"

    The SJWs are commanding us to adulterate statistical reports and tilt risk analysis models to benefit people that don't deserve it and haven't earned it.

    1. Is It Me

Re: Orwellian redefinition of "prejudice"

Putting SJW in anything shouts Alt-Right, and from my point of view (and a lot of other people's) immediately invalidates anything that is said after.

      1. Libertarian Voice

Re: Orwellian redefinition of "prejudice"

I think along similar lines about people who use the perverse term "social responsibility". And I am a Libertarian, which is far removed from both the alt-right and the oxymoron that is antifa (the fascist left).

      2. Suricou Raven

Re: Orwellian redefinition of "prejudice"

Putting aside political insults, it does describe a certain tension. ML engineers are interested in developing the most accurate statistical models possible, but sometimes accuracy must be compromised for social or legal reasons. If your machine learning engine crunches numbers and decrees 'arrest all the black suspects' or 'stop treating cancer patients over 80, they don't have enough years left to justify the expense', then it might very well be giving statistically valid conclusions - but people are still going to be very upset if those conclusions are acted upon.

Simple conclusions like that can be ignored, but the bias may not always be so obvious. When your program for estimating car insurance premiums is a black box made by an ML process, it can be difficult to determine whether it is incorporating gender or race into the model. Even if you exclude that information from the input, it can be inferred from other things.
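        A toy illustration of that last point (all data synthetic and purely illustrative): drop the protected column from the inputs, then show that a model can still recover it from correlated features.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(1)
        n = 5000
        protected = rng.integers(0, 2, n)  # e.g. gender; never shown to the pricing model

        # "Innocent" features that happen to correlate with the protected one
        # (stand-ins for postcode, occupation, car model and so on).
        feat_a = protected * 1.5 + rng.normal(0, 1, n)
        feat_b = protected * -0.8 + rng.normal(0, 1, n)
        X = np.column_stack([feat_a, feat_b])

        X_tr, X_te, y_tr, y_te = train_test_split(X, protected, random_state=0)
        probe = LogisticRegression().fit(X_tr, y_tr)
        print(f"protected attribute recovered with accuracy {probe.score(X_te, y_te):.0%}")
        # Well above 50% chance: excluding the column from the input does not
        # exclude the information, so a black-box premium model can still use it.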

  3. iron Silver badge

    "So offering a job only to someone over six foot tall... clearly discriminates against women."

    No it doesn't. It clearly discriminates against short and average height people no matter what gender they are or wish to be recognised as. There are women who are over 6 foot tall so it does not discriminate against women.

    Offering a job only to people with a penis, THAT clearly discriminates against women.

    1. Anonymous Coward
      Anonymous Coward

      Wanted for Immediate start:

      Life model for our new patented "Wang Warmer" for sale in catalogues and online.

      Must provide own Wang.

      1. The Nazz

        I can't see this woman applying for that job ...

        http://www.bbc.co.uk/news/entertainment-arts-41107089

    2. big_D Silver badge
      Facepalm

There was a case in Germany recently, where a woman was 1cm too short to join the police force. She sued for discrimination. She didn't get the response she was hoping for.

      The German force in question had something like a 15cm height difference between men and women (i.e. women could be 15cm shorter than men).

The court came to the conclusion that, yes, the height requirement was arbitrary, but it was lawful. What was not lawful was having different height requirements for men and women.

    3. Anonymous Coward
      Anonymous Coward

      "So offering a job only to someone over six foot tall... clearly discriminates against women."

      No it doesn't.

It certainly doesn't. I'm six foot, and I saw an absolute belter of a bird in Tesco the other day, easily three inches taller than me. I'd have taken her home, other than for the fact that she'd have easily flattened me (and if she hadn't, the much shorter wife most certainly would, as a prelude to separating me from my knackers). But it's nice to dream.

    4. bombastic bob Silver badge
      Devil

      " It clearly discriminates against short and average height people"

      cue Randy Newman singing "Short People"

    5. cream wobbly

      "So offering a job only to someone over six foot tall... clearly discriminates against women."

      No it doesn't.

      ----

      The average height of women is less than the average height of men. You make a requirement for people "only over 6' tall" and you immediately disqualify more women than men.

Here's a picture for you: [chart of adult height distributions, men vs women]

      Look at the difference in population at 182 cm (6 foot) -- for men, you're at about the 60th percentile. You have a huge population of men to choose from -- for women, you're in the weeds, somewhere around the 95th or 98th percentile. This is simple statistics. To claim it's non-discriminatory is a blatant display of ignorance.
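      The rough numbers behind that, using a normal approximation (the means and SDs below are illustrative assumptions, not official anthropometric data):

      from scipy.stats import norm

      cutoff = 182.0                       # 6 foot, roughly, in cm
      men = norm(loc=180.0, scale=8.0)     # assumed mean/SD, for illustration only
      women = norm(loc=166.0, scale=7.0)

      for label, dist in [("men", men), ("women", women)]:
          share = 1 - dist.cdf(cutoff)     # fraction of the group over the cutoff
          print(f"{label}: {share:.1%} clear 182 cm")
      # With these assumptions roughly 40% of men but only about 1% of women
      # pass the bar -- which is the disparate impact being described.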

      1. werdsmith Silver badge

        "So offering a job only to someone over six foot tall...

        The "offering" part is way down the process. A job can be advertised as available to anyone, but a person considering the candidates can throw away an CV that they don't like and make their own choice of rejection from a shortlist of candidates whilst giving any plausible criteria for the reason. There is not much that any legislation can do about that, so in the end the law is ineffective.

        If a recruiter wants a young team the recruiter can reject the older folks at interview because they "want people who will fit in with the team".

        It's tough, but it's life.

    6. Phil Lord

It discriminates against women because they are generally shorter than men; hence it is effectively a mechanism to discriminate against a protected characteristic -- your gender.

      It does not discriminate against short/tall people because that is not a protected characteristic. "Discrimination" in a legal sense does not just mean "distinguish" or "differentiate".

    7. Anonymous Coward
      Anonymous Coward

Yes, setting a blanket minimum height requirement will cause "indirect discrimination" against women, because women are on average shorter than men. If you cannot justify the height requirement, then you are likely to find yourself at the wrong end of a judgement from an employment tribunal. Conversely, a blanket maximum height requirement (say, for working in cramped conditions) is indirect discrimination against men.

Further, you would have to show that "reasonable adjustments" are not possible - that is, that it was unreasonable or possibly impossible to adjust the working environment to remove the minimum height requirement.

Now you might think that "reasonable adjustments" is a really woolly turn of phrase. The problem for you is that Employment Tribunal judges have developed a lot of case law over the years on what "reasonable adjustments" actually means.

So if you want to put a requirement like that in the job description, the sensible approach is to talk to a specialist employment lawyer, who will advise you whether your requirement is going to fall down in front of the judge at the employment tribunal. Then, if they say no, remove the requirement from the job description. Of course, HR are known for completely ignoring the advice given and then wondering down the line why they have just lost an employment tribunal case.

  4. tiggity Silver badge

    Given this "no discrimination on these characteristics" rule has been in effect since 1990 in UK , and (a long time after that date) the police refused to come to my location when a crime was reported late at night as (paraphrase as I cannot remember exact words) "that area was too dangerous to visit late at night, we only attend there at night if someones life is at risk"

    That area was predominately non "white", so can residents / ex-residents of there sue the relevant UK police force for their racial discrimination?

    .. irrespective of the inferred prejudice it was a ******* dangerous area to live, but those of us who lived there did not have the luxury of keeping away when it was darkt.

  5. Nick Z

    People don't fully understand how discrimination happens, which makes it hard to fix

    I'd say that a lot of discrimination nowadays happens because of good intentions.

    For example, the IT industry is well known for its use of social networking to hire and promote people. People are making friends with each other and then using their friendly ties to get ahead.

It all sounds friendly and good, until you think about how this might affect women in a male-dominated industry. When a guy invites another guy for a cup of coffee at home and a friendly video game, then it's just that and nothing else. But if an attractive woman invites a guy like that, then he'll probably bring a pocketful of condoms with him, just in case.

    It's a lot easier for men to make friends with other men, than it is for women to make friends with men in a non-sexual way. Which means that using social networking for hiring and promotions in a male-dominated industry puts women at a disadvantage. Women are much more limited in what they can do in terms of networking.

    Artificial intelligence can't fix a problem that people themselves don't fully understand and talk about.

    1. Anonymous Coward
      Anonymous Coward

      Re: People don't fully understand how discrimination happens, which makes it hard to fix

      "It's a lot easier for men to make friends with other men, than it is for women to make friends with men in a non-sexual way. "

For a subset of knuckle-dragging men. I'm male (and hetero, in case you're wondering) and in a professional context I find women far easier to establish good working relationships with.

      1. Nick Z

        Re: People don't fully understand how discrimination happens, which makes it hard to fix

        A good working relationship is different from actually having her home phone number, email, and other ways of contacting her.

        Because that's the kind of relationship you need to keep in touch with each other, after one or both of you leave your jobs and are looking for other opportunities.

        That's what networking is all about, when it's being used to find better jobs with more pay and more responsibilities.

Guys who are friends have no problem keeping in touch with each other like that. But a woman would need to break all kinds of conventions and norms to network like that with a guy. And that's what I'm talking about here. This is the disadvantage for women.

    2. bombastic bob Silver badge
      Devil

      Re: People don't fully understand how discrimination happens, which makes it hard to fix

      nice try, but I'm not buying.

      people NATURALLY discriminate. You don't want the street person that begs for change in front of your favorite shopping center as your best friend, right? That's an extreme case, but it makes my point.

      Men typically won't date women who are unattractive, unless they're feeling particularly undersexed and desperate. And we're willing to tolerate quite a bit for someone who's an 11 on a scale of 1-10. Yeah.

Women typically won't date men who don't earn enough money. Admit it. There are exceptions, of course, but I'm referring to the general case here. You ladies just don't want THAT GUY burdening you, being a "hanger on" or "clingy" or crashing on your couch for 5 years.

      Fat and unattractive people are regularly NOT hired in comparison to thin, nice looking people. It's a fact. This is why you always "dress for success" when interviewing for a job. (It also proves you have a good attitude, to dress for success, so that's a factor as well).

      People who smoke are discriminated against, too. I mean, the ONE SMOKER in the office, who stinks like cigarettes all of the time, is constantly taking "a smoke break", yotta yotta. It's just FREAKING IRRITATING and so THAT guy typically won't get hired, and may be paid LOWER because "all of those smoke breaks" aren't getting work done. Yes, they're less productive, therefore not worth the higher wage that a typical non-smoker would earn.

      Anyway, it's just human nature to discriminate. Not justifying outright racial or religious or sex-based discrimination, of course. Just pointing out that maybe it's not such a bad thing, to discriminate within a reasonable tolerance.

I was once asked to interview people for a job as an embedded developer. I asked one simple question: after identifying a problem I'd seen, with no obvious solution, I asked candidates how they'd approach it. The guy that was hired just said "I'd connect a JTAG or other debugger..." and that was the answer I was looking for. But one person applying, who happened to be a woman, seemed to talk a good game but couldn't come up with an answer like THAT one. I suspect if she'd been hired, regardless of her impressive pedigree, she'd have wheel-spun over otherwise solvable problems for MONTHS, making up excuses for not getting things done, etc. etc. and trying to fire her just might have invited the SJW brigade to claim "it's because she's a woman".

      So yeah it wasn't because she was a woman that I wanted the other guy instead, it's because she probably wasn't someone who actually gets things DONE. Amazing how fast you can reveal that, too, if you ask the right questions [and my pedigree, compared to hers, is pathetic, so I guess pedigrees aren't worth the paper they're written on in the world of embedded systems engineering].

      No doubt she probably continues to work for a large company where she can continue being 'somewhat mediocre' and yet fulfill the HR requirements of having more female employees (and thus stave off the blood-sucking lawyers and SJW activists).

      1. Nick Z

        Re: people NATURALLY discriminate

        Perhaps the word 'discriminate' isn't quite the right word to use, when you talk about choosing a friend or choosing someone to have a relationship with.

        Because this isn't discrimination. This is just your personal choice.

        Discrimination happens only when people at work use their friendships to favor some people over others in hiring and promotions.

The more people have in common with each other, the easier it is for them to relate to each other and be friends. Which in the IT industry means that all the white guys are friends with each other and help each other out, while women and minorities get excluded.

        And I'm not talking here about having a good working relationship. Sure, white guys usually have good working relationships with women. But they are less likely to be friends outside of work. And that's where it makes a difference whom the guy will recommend for a job or for a promotion. When people do favors for each other, then they do it for real friends and not for mere acquaintances.

        1. Intractable Potsherd

          Re: people NATURALLY discriminate

"Perhaps the word 'discriminate' isn't quite the right word to use..."

Actually, it is exactly the right word. What is being referred to in the article is *unfair* discrimination. We all discriminate with regard to people all the time - my experiences mean I'm more likely to favour certain people than you would, for instance. The issue is how to separate out these personal preferences so that people are not unfairly discriminated against. This, as mentioned, does cause a problem when building a team - someone who has the best qualifications on paper and then does well at interview might not "feel" right compared to someone who you know would fit with the team but has fewer qualifications. I end up being ambivalent about this - on the one hand, it leads to Oxbridge cronyism in high places, but, at the same time, I have been the beneficiary and the benefactor of this at various times.

          1. Nick Z

            Re: The issue is how to separate these personal preferences ...

I'd say that the more informal your decision-making is, the more your personal prejudices will come out and play a role in your decision.

Because when you don't follow some kind of explicit rules that other people can look at and judge as fair or not, then it's just your personal opinion that you use to make your decision.

When people are prejudiced, they usually don't know that they are prejudiced. They think that they are being fair and square.

The only way to become aware of your prejudice and correct it is to let many other people look at your decision-making and let them judge whether it's fair or not. Which isn't possible when you base your decision on how you feel, rather than following some explicit rules that are available for other people to look at. Your feeling is available only to you and not to other people.

Informal decision-making that's opaque and cannot be examined by other people is probably another big cause of discrimination in the IT industry.

            1. werdsmith Silver badge

              Re: The issue is how to separate these personal preferences ...

              It's quite clear that when choosing who to exterminate, the Daleks are quite blatantly discriminating against time-lords and humans.

  6. lone_wolf

    Freedom of Association

    So no one ever discusses how anti-discrimination laws infringe on freedom of association?

  7. Rajiv_Chaudri

    Humans learn their prejudice through statistics and Big Data.

Occam's razor. Stereotypes and prejudices are often based in math and numbers. They are called "clichés" for a reason.

Made-up stereotypes never stick, as there's no truth to them.

    1. DryBones

      I'm going to half-disagree. Stereotypes are generalizations. Prejudices are judgements made about the value of those things stereotyped.

      Black/white

      iOS/Android fans

      Engineers/Managers

      Christians/Muslims

      Chocolate/Strawberry ice cream...

    2. Nick Kew

      Humans learn their prejudice through statistics and Big Data.

      That sounds more like postjudice than prejudice.

      Or am I old-fashioned, taking the word to say what it means?

  8. John Smith 19 Gold badge
    FAIL

TL;DR: Biased training sample --> "Computer says no."

    Now....

Is the sample biased for (or against) a group because it was not balanced to avoid being so (i.e. it was selected by incompetents who did not understand this is an issue in ML)?

    Or

    Was it selected to ensure exactly the outcome it's giving?

    The first means the devs are responsible. The second means both the devs and whoever told them to do it are responsible.
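    For the mechanics of the first case, here is a toy sketch (all data synthetic and illustrative): the true approval rule is identical for both groups, but group B's successful cases were under-sampled when the training set was assembled - and the model can't tell whether that happened by incompetence or by design.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(2)
    n = 4000
    group = rng.integers(0, 2, n)    # 0 = A, 1 = B
    score = rng.normal(0, 1, n)      # genuinely predictive feature
    approved = (score + rng.normal(0, 0.5, n) > 0).astype(int)  # same rule for both

    # The biased sampling step: drop 80% of group B's approvals from the data.
    keep = ~((group == 1) & (approved == 1) & (rng.random(n) < 0.8))
    X, y = np.column_stack([score, group])[keep], approved[keep]

    model = LogisticRegression().fit(X, y)
    for g in (0, 1):
        same_applicants = np.column_stack([np.zeros(1000), np.full(1000, g)])
        rate = model.predict_proba(same_applicants)[:, 1].mean()
        print(f"group {g}: predicted approval {rate:.0%}")
    # Identical applicants, different answers: the model faithfully learned
    # the bias baked into the sample. "Computer says no."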

    1. cream wobbly

Re: TL;DR: Biased training sample --> "Computer says no."

      Nope, all are responsible for skewing the data in both examples. Incompetence doesn't absolve someone of responsibility.

      What you're angling for is intent: you're actually, really, begging the question.

      1. John Smith 19 Gold badge
        Unhappy

        "Incompetence doesn't absolve someone of responsibility."

        Nor did I suggest it did.

        Perhaps you'd like to read what I wrote again, slowly.

        You appear to be having trouble parsing it.

  9. ZenCoder

    Software is now giving sentence recommendations in the USA.

Look up COMPAS, from a company called Northpointe. Based on undisclosed methods, it spits out a pie chart that allegedly represents a convicted criminal's likelihood of reoffending and is used to determine the severity of sentencing. The convict has no ability to examine or refute any part of the system, as it involves secret algorithms and proprietary data.

Allegedly, it tends to give too little weight to an individual's actual offenses and instead focuses heavily on the neighborhood they are from and their education. So you are literally being punished for the crimes of your neighbors, or rewarded for growing up in an affluent area.

True or not, there should be no place for "for profit" agencies, or for secret evidence and procedures, in criminal justice.

    1. bombastic bob Silver badge
      WTF?

      Re: Software is now giving sentence recommendations in the USA.

      "llegedly it tends to give too little weight to an individual's actual offenses and instead focuses heavily on the neighborhood they are from and their education"

      if this is actually TRUE, then the appeals lawyers would have a FIELD DAY with it. So I suspect it is not. Unless the people who approved it are dumber than dirt. which is possible...

      1. ZenCoder

        Re: Software is now giving sentence recommendations in the USA.

        "Unless the people who approved it are dumber than dirt. which is possible..." not just possible but a proven fact in multiple states.

        However other States are being sensible and disclosing to the public exactly how their risk assesment algorithms work.

  10. Anonymous Coward
    Anonymous Coward

    "[...] while differential pricing of motor insurance according to gender is unlawful in the EU, although it still happens."

    A friend who had moved his family to Spain explained that the family car was insured in his name. His wife - with a much better UK insurance record - was quoted a higher premium.

    1. werdsmith Silver badge

      Should have gone to Sheila's Wheels..

  11. The Nazz

    Just how prejudiced is the author herself?

    "So ... or refusing credit to inhabitants of Bradford could both fall foul of discrimination law. The ..second, because it is likely to discriminate against individuals from a particular ethnic group."

    Seriously? Refusing credit to (all) inhabitants of Bradford could discriminate against a particular ethnic group?

Would she care to clarify which group she is implying could be discriminated against?

Surely she knows that Bradford has been one of the most ethnically diverse cities (and towns) in the UK since well before 1900. There can barely be an ethnicity in existence that isn't represented in Bradford these days.

Refusing credit to Bradford discriminates against every single inhabitant, whatever ethnicity, race, religion, or other "protected" category they are from.

    Any reason why Bradford was specifically chosen? A prejudicial choice?

    1. Anonymous Coward
      Anonymous Coward

      Re: Just how prejudiced is the author herself?

      I'd choose Bradford. To test a neutron bomb, at any rate.

    2. Anonymous Coward
      Anonymous Coward

      "Any reason why Bradford was specifically chosen? "

      Because a DNA survey suggests most of them are part of the same extended family?

  12. Pascal Monett Silver badge
    Flame

    "Target identified [..] a teen girl" being pregnant

    Sorry, but to me that is clearly an invasion of privacy and way out of bounds.

It just shows that marketing has no barriers or morals. It's all about making money at any cost.

    Of course, it would be a lot harder to code all those ad algorithms with somebody continually asking what the consequences are, but the fact is they have been coded without anyone asking what the consequences are.

    So the consequences are just showing up on their own. And real people are bearing the cost of that, not companies who are foisting their impersonal bots on us.

  13. OrneryRedGuy

    Unintended discrimination

ML is absolutely crap at differentiating between cause and correlation. If you train it on data that contains a discriminatory bias, it will learn that bias. That is fundamental to its very design. Unfortunately, such biases abound in most real-world data. A relative dearth of female engineers in the field may lead ML meant to evaluate new hires to turn away women, perpetuating the cycle. (Please save the debate about whether we need more female engineers or not. I think we can all agree that there could be a woman candidate more than qualified for such a job, even if she is an exception, and if your ML turns her away it's discriminatory and has done both her and your company a great disservice.)

Until ML can ask "why" about the correlations it finds, we're going to have to evaluate human beings as the individuals they are and not rely on shortcuts. The problem is, clearly nobody wants to have to do that.
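    A toy run of that feedback loop (synthetic data throughout; the numbers are illustrative): qualifications are distributed identically, but the historical hire/no-hire labels reflect a biased process, so the model learns to repeat it.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(3)
    n = 5000
    female = rng.integers(0, 2, n)
    skill = rng.normal(0, 1, n)          # same distribution for everyone

    # Historical hiring: skill mattered, but women faced a higher bar.
    hired = (skill - 0.8 * female + rng.normal(0, 0.3, n) > 0).astype(int)

    model = LogisticRegression().fit(np.column_stack([skill, female]), hired)

    # Two equally qualified candidates: one woman, one man.
    candidates = np.array([[1.0, 1], [1.0, 0]])
    p = model.predict_proba(candidates)[:, 1]
    print(f"woman: {p[0]:.0%} hire probability, man: {p[1]:.0%}")
    # The model has no concept of *why* fewer women were hired; it just
    # reproduces the correlation, perpetuating the cycle.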

  14. Mike007 Bronze badge

    Cause and effect?

I raised what I considered an interesting legal point with my University. I queried how lecturers advertising scholarships with restrictions on gender was consistent with the Equality Act.

The university administration initially claimed not to understand why there would be an issue. I pointed out that, technically speaking, as there were 5 female-only scholarships offered to a group containing 4 female students, they would need to offer over 300 male-only scholarships in order to ensure they weren't financially discriminating against the male students based on gender (rough arithmetic below). They referenced what appeared to be generic legal advice about promoting under-represented groups, which was based on a section of legislation that had been overturned by the EU courts and repealed due to its discriminatory nature. They didn't reply to my email pointing this out.
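    The rough arithmetic behind the "over 300" figure, assuming a male cohort of around 250 for illustration (the actual number isn't stated above):

    female_students, female_scholarships = 4, 5
    male_students = 250                               # assumed cohort size
    per_head = female_scholarships / female_students  # 1.25 scholarships per woman
    print(f"male-only scholarships needed: {per_head * male_students:.0f}")  # ~313, i.e. "over 300"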

    To bring it back on topic: What would an AI learn from that data?

It might, for example, find a bias whereby the group of students who got scholarships have a higher minimum competence than those that didn't. An AI would be completely oblivious to something like social factors making a female student who doesn't know how to use a command prompt less likely to take a university-level IT course than a male student of the same ability level.

The AI might instead learn that students who get scholarships perform less well than their non-scholarship-receiving cohorts. There could be a bias whereby scholarships encourage people who are less knowledgeable about the subject to "give it a go".

    Would the AI be more or less biased by including gender in the training set? This is a training set whereby the "top performers" will pretty much all be male, yet the females are all being paid more money.

  15. Kjeld Flarup
    Alert

    Computers should not choose people

In some science fiction, computers were used to take human prejudice out of the equation. We have just learned that this is a hard task to achieve.

But let us assume that we could build a genuinely serious and honest algorithm which feeds correct data to an ML system, so it can find the right person for a given job.

And, based on rational input and serious evaluation, it finds that red-headed people are never qualified for any job.

There are two painful lessons to learn from this.

One is that people are different. An honest ML system can't put PC glasses on and ignore this. Thus, if all people were chosen by this rational ML system, there would be people, or even groups, who NEVER got a job.

The second lesson is that trial and error has been eliminated. You will never get the unorthodox person employed who has the next brilliant idea. In a sense, we may halt development, and perhaps even man's ability to adapt to change.

So letting computers choose people is perhaps a bad idea. But I'm certain this bad idea is being pursued in a lot of places today.

    1. OrneryRedGuy

      Re: Computers should not choose people

      For many jobs, an employer doesn't want creativity. They want a replaceable cog, to replace the previous cog that burned out/fell into the mixing vat/somehow made it to retirement age. Even if creativity could be a benefit, that's a crapshoot far outweighed by the dependability promised by ML (whether delivered or not). Hence, there will always be the incentive to automate it.

      It might even work, from the manager's perspective. Sure, if the ML is discriminatory some worthy candidates get binned, but who cares so long as it produces enough (blond, blue-eyed male) cogs to keep things churning?

      Well, we as a society care, or should. ML right now serves to detect patterns and, in the context of this article and discussion, reinforce them. Long term, this is bad for everybody not in the historically favored group, which is bad for everybody who IS in the historically favored group. A truly rational ML system needs to be able to determine which trends are important ("is literate") and which are specious ("is a member of X ethnic group"). Direct discrimination is easy to squash - tell the ML to ignore which gender box is checked or what the surname is. But indirect discrimination isn't so easy.

      Even if there is a fundamental disparity in the abilities of different groups of people (redheads, in your example), when evaluating an individual you must acknowledge that they may be an outlier. Being PC is a lazy solution. Being decidedly non-PC is just as lazy. I would argue that PC-ness shouldn't be a consideration; find the best person for the job (I'm still speaking in an employment context, obviously). ML will by design maintain the status quo, perhaps with greater efficiency. Do we want to change the status quo (I think we do)? Do we want to use ML? If the answer to both these questions is yes, we've got a lot more work to do on ML before it's ready to pass judgment upon us.
