Bosses using AI to hire candidates risk discriminating against disabled applicants

The Biden administration and Department of Justice have warned employers using AI software for recruitment purposes to take extra steps to support disabled job applicants or they risk violating the Americans with Disabilities Act (ADA). Under the ADA, employers must provide adequate accommodations to all qualified disabled job …

  1. druck Silver badge

    Computer vision software analyzing a candidate's gaze, facial expressions, or tone is not appropriate for those who have speech impediments, are blind, or paralyzed. Employers need to take extra precautions when using AI in their hiring decisions, the document advised.

    It's not appropriate, full stop. As a human I expect to be interviewed by other humans. If they are not up to screening candidates themselves and need help from so-called AI, that tells me everything I need to know about that company, and I certainly won't be working for them.

    1. An_Old_Dog Silver badge

      AI/ML: overhyped, frequently mal-used

      The problem with AI/ML is that you can't determine how it got its results, so you've no idea whether those results are accurate or fair.

      For some problems, such as heuristic railcar routing in the early years of computers, the bad results (occasional misrouted railcar) were acceptable compared to the good (the ability to get timely solutions from slow hardware). These days, AI/ML is applied to problems where the bad results (mal-discrimination) should be considered to outweigh the good.

      For evil-minded assholes actively, wrongly-discriminating against various groups of people, AI/ML is manna from heaven. "No job/loan/insurance for you! 'Cause computer says so. Next!"

      A subpoena of conventional, deterministic computer code and examination by experts can reveal intentional mal-discrimination in that code. You can't find mal-discrimination in AI/ML code: it's in the datasets. You can't "prove" mal-discrimination in an AI/ML dataset; you can only infer it, which is a lot of technical/legal wiggle-room for any mal-discriminating assholes who ever do end up in court.

      1. SVD_NL Silver badge

        Re: AI/ML: overhyped, frequently mal-used

        The worst part of it is, there are ways to keep your AI/ML under control, but you need to design it with that in mind from the very beginning. And even then, you'll need to constantly keep looking into it to see what it's doing, and make sure it's not come up with some crazy solution that doesn't make sense but fits the data.

        I'm personally against using this opaque type of AI for these tasks. AI is discriminatory by nature; discriminating is exactly what it's designed to do, and it will just go with whatever data you give it.

        Here is a Reuters piece about an AI Amazon used in their hiring process:

        Turns out that if you have a sexist hiring process for (at least) a decade, turn that into a dataset, and feed it into an AI, the AI will be sexist too! Who would've thought? Assuming Amazon had good intentions here (which may be a bit of a stretch), it just shows how difficult it is to rid your dataset of accidental discrimination. Confounding factors can be a massive pain too, especially with self-training AIs.
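        That dataset-to-bias pipeline is easy to demonstrate. Here is a minimal sketch (invented toy data; "womens_club" is a made-up stand-in term, nothing to do with Amazon's actual model): score each resume term by its historical hire rate, and the past bias falls straight out as a penalty.

```python
# Toy demonstration: a "model" that scores resume terms by their
# historical hire rate will reproduce any bias baked into that history.
# The data below is invented; "womens_club" is a stand-in for any term
# that past (biased) decisions correlated with rejection.
history = [
    ({"python", "sql"}, 1),           # hired
    ({"python", "womens_club"}, 0),   # rejected (biased history)
    ({"sql", "womens_club"}, 0),      # rejected (biased history)
    ({"python", "sql", "java"}, 1),   # hired
]

def term_score(term):
    """Mean hire rate among historical resumes containing `term`."""
    outcomes = [hired for terms, hired in history if term in terms]
    return sum(outcomes) / len(outcomes)

def score_resume(terms):
    """Average the learned per-term scores -- the whole 'model'."""
    return sum(term_score(t) for t in terms) / len(terms)

# Two otherwise identical candidates; one mentions a women's club.
print(score_resume({"python", "sql"}))                 # ~0.67
print(score_resume({"python", "sql", "womens_club"}))  # ~0.44: bias learned
```

        Note that nobody wrote "penalize women" anywhere; the discrimination lives entirely in the dataset, which is exactly why it is so hard to subpoena.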

    2. Yet Another Anonymous coward Silver badge

      > that tells me everything I need to know about that company, and certainly won't be working for them.

      No you won't, because this software isn't for your sort of job.

      The jobs Reg readers apply for filter out resumes that don't include the keywords C/C++, Python, SQL or VERB-of-the-MONTH.js in the first paragraph. Then they filter out the ones without a CS degree from a top-10 uni (candidates over 30) or an internship at a FAANG (candidates under 30); all a very human-based process.
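      That first-pass keyword screen amounts to a few lines of code, which is roughly why it gets automated. A hedged sketch (hypothetical keyword list and rules, not any real applicant-tracking product):

```python
# Sketch of a first-pass resume screen (hypothetical rules, not a
# real applicant-tracking product): keep only resumes whose first
# paragraph mentions at least one required keyword.
REQUIRED_KEYWORDS = {"c++", "python", "sql"}

def first_paragraph(resume_text):
    """Everything before the first blank line, lower-cased."""
    return resume_text.split("\n\n", 1)[0].lower()

def passes_keyword_screen(resume_text):
    """True if any required keyword appears in the first paragraph."""
    para = first_paragraph(resume_text)
    return any(kw in para for kw in REQUIRED_KEYWORDS)

resume_a = "Senior engineer: Python, SQL, Kubernetes.\n\nExperience ..."
resume_b = "Seasoned developer, strong fundamentals.\n\nSkills: Python, SQL"
print(passes_keyword_screen(resume_a))  # True
print(passes_keyword_screen(resume_b))  # False: keywords buried below
```

      Note the false negative: resume_b has the skills, just not in the first paragraph, which rather illustrates how crude the "very human-based" process already is.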

      But if you are recruiting 100,000 Amazon warehouse staff this month, you aren't going to have the CEO contact each of them to ask where they see themselves in 5 years, or to describe a time when they failed and what they learned from it.

      1. Neil Barnes Silver badge

        Indeed you can't. But equally, the questioning is probably much more focussed: Will you turn up on Monday? Will you stay the rest of the week?

        You don't apply for a job as a shelf stacker because you see a great career in stacking shelves. You do it because you want to eat.

        1. Warm Braw

          There's a reasonable argument that recruitment for this type of job is best done simply by putting names in a hat. With the caveat that certain jobs might require a certified level of skill (such as being able to drive or to extract an appendix), I suspect the principle could be applied more widely without anyone noticing.

        2. fajensen

          You don't apply for a job as a shelf stacker because you see a great career in stacking shelves. You do it because you want to eat.

          That may be so, but, to get the job, you first need to truly believe that your Dream and Only Purpose in Life is to become the world's bestest, happiest, and most dedicated shelf stacker, in order to first fool the AI, then HR, and finally convince the recruiting panel!

          The cruelty is intentional.

      2. Jonjonz

        One manager tasked with hiring 1000 staff in a month. You wish.

        Maybe opening a new plant, but then there would be more than enough managers available.

        Using AI in hiring is business shooting themselves in the foot.

        Until you can trust AI to mow your lawn without the aid of electronic guides, it is not ready for prime time.

    3. Mark 85

      This does remind me of hiring practices back in the '70s. Part of it was the polygraph, a so-called lie detector. Tests showed that it could be defeated by simple techniques, and the competence of the operator was always a problem, along with biases. The courts finally tossed it, for the most part, from use by law enforcement.

  2. Kevin McMurtrie Silver badge


    Run

    It's best to never agree to those interviews. The interview is a preview of the job. Being treated like a low-value and replaceable resource won't end after the AI screening.

    1. DevOpsTimothyC

      Re: Run

      Is it mandatory for companies using these tools to disclose that prior to the candidate applying for the role?

      Until that is a legal requirement WITH penalties, companies will use the tools with impunity. How many people would terminate their employment if they found out they had been subjected to that?

    2. Mark 85

      Re: Run

      Being treated like a low-value and replaceable resource won't end after the AI screening.

      Hmmm.... it's too late really. The department name "Human Resources" says it all, and just about every company of any size has one. That department used to be called "the employment office" or "personnel". The de-humanization of staff started a long time ago.

  3. Magani

    Seems an apt quote

    "To err is human, but to really foul things up you need a computer."

    Paul Ehrlich

  4. Anonymous Coward

    Unfortunately the genie is already out of the bottle

    The problem with AI is that you need a lot of data for your specific usage, and someone experienced with creating AI solutions to check it's performing as expected.

    Unfortunately companies are using “off the peg solutions” with poor end results.

    The US company I work for has just tried rating “customer sentiment” in support calls using one of the “packaged” solutions.

    The result was that it flagged almost every call. Digging into it, it seems the AI didn't like words like "problem" or "issue", which, as you can imagine, occur reasonably frequently in support calls. And no, they hadn't re-trained it on data from our company.

    I note it's showing no results now, and management have stopped asking us to check it daily as they were.
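    The behaviour described is consistent with a bare keyword lexicon rather than anything context-aware. A minimal sketch of that failure mode (invented word list, not the actual vendor's model):

```python
# Naive lexicon-based "sentiment" flagging, the failure mode described
# above: generic negative words are routine vocabulary in support calls,
# so nearly everything gets flagged. Word list is invented for illustration.
NEGATIVE_WORDS = {"problem", "issue", "error", "broken", "fail"}

def flag_call(transcript, threshold=1):
    """Flag a call if it contains at least `threshold` lexicon hits."""
    words = (w.strip(".,!?") for w in transcript.lower().split())
    hits = sum(1 for w in words if w in NEGATIVE_WORDS)
    return hits >= threshold

calls = [
    "hi i have an issue with my login",             # flagged, fair enough
    "thanks the problem is solved you were great",  # flagged, yet positive
    "quick question about billing",                 # not flagged
]
print([flag_call(c) for c in calls])  # [True, True, False]
```

    Retraining on domain data, which the commenter notes wasn't done, is precisely what would teach a model that "issue" is neutral vocabulary in this context.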

    I remember someone once saying “we are idiots living in a world created by geniuses”!

  5. Wenlocke

    That whole "employment gap" thing has always annoyed me tremendously, because the implication is that unless you were productive pretty much continuously, you're useless. It also buys into the whole "being unemployed is a failure of character" take that we get from, well, mainly people whose livelihood depends on there being people to work for peanuts.

    I can kind of see a bit of suspicion in jobs where they do heavy vetting (does this person applying for a job in the security services have a nice large year off where they spent some time living somewhere suspicious? Might want to look at that). But really, there's very little excuse for it outside the security and financial arenas.

    1. Anonymous Coward
      Anonymous Coward

      The security services see all aspects of humanity and are the most understanding employers. Basically, all they really want to verify is that you are not lying to them and will not get them into hot water later …

      Mostly, they just don't care what filth you have rolled yourself in, except for some specific transgressions related to our current crop of adversaries and some straight-up criminal acts*. If you indeed did do some dodgy things frowned upon by "society", then they just have more guarantees of your loyalty, by documenting it all in your service record.

      *) Looking at the UK, one could be led to assume that being a pedo is not one of the "prohibited sports".

  6. jmch Silver badge

    "Computer vision software analyzing a candidate's gaze, facial expressions, or tone... "

    So, like a Voight-Kampff test for androids to run on humans instead of the other way round?

    1. Richard 12 Silver badge

      And you fail if you show any empathy, of course.

      1. jmch Silver badge

        You come across your Amazon warehouse colleague, Leon. He is lying on his back, collapsed under a pile of boxes. He is asking for help... but it's your 30-second pee break, Leon, and your supervisor has already warned you not to exceed the time, Leon. What do you do, Leon?



  7. Cederic Silver badge

    state sanctioned racism

    It's frustrating to read the recommendation that hiring should score people differently based on innate characteristics.

    If the algorithm measures things that typically score differently for people with different racial backgrounds, so what? Sports franchises should recruit racially balanced teams? No, they're looking for a specific characteristic and should hire people with that characteristic, irrespective of race, gender, height, disability or anything else.

    So should everybody else.

    My current employer doesn't even know about my disabilities, two of which actively inhibit my ability to do my job. I didn't tell them; they need someone capable of X and I'm happy to be measured against that need.

    Get AI out of employment because it's not AI, it's a badly tuned algorithm that lacks the nuance to properly assess candidates. But also employ people that meet your needs, and don't score them differently based on innate characteristics.

    1. Richard 12 Silver badge

      Re: state sanctioned racism

      Oy vay...

      If you score candidates using metrics that are strongly affected by discrimination external to your company, then your score will be illegally discriminatory, even though you did not intend this.

      For example, if you primarily score on whether candidates went to Eton (or similar), you will primarily hire white men with very rich parents, as women and those whose parents were below the 99th income percentile cannot attend.

      Have a look at the UK Conservative Party for an example of such hiring practices.

      1. Cederic Silver badge

        Re: state sanctioned racism

        Happily the law in the UK forbids indirect discrimination, whether through use of a computer algorithm or in person, so requiring attendance at Eton would already be illegal.

        (Except for the Conservative Party selecting candidates for election, as the Labour Party put an exception into anti-discrimination legislation so that they could legally discriminate against men - not even indirectly, very explicitly. Which is how we got political luminaries such as Jess Phillips and Angela Rayner, who make such a compelling contribution to Parliament.)

    2. yetanotheraoc Silver badge

      Re: state sanctioned racism

      "My current employer doesn't even know about my disabilities, two of which actively inhibit my ability to do my job. I didn't tell them; they need someone capable of X and I'm happy to be measured against that need."

      It was probably best you didn't tell them.

  8. hammarbtyp

    Here we go again

    It was common until quite recently to employ a graphologist (i.e. a handwriting analyst) to filter job candidates. This technique was widely used to ascertain candidate characteristics based on their handwriting, and it may still be used in places. The only problem with it was that the science it was based on was total junk, the equivalent of choosing someone by their star sign.

    Choosing candidates via AI seems to me similar, with an added flavour of special techno sauce which "magically" chooses the most capable candidates and therefore becomes a sort of cargo cult to the ignorant HR masses. I expect stories in HR Weekly saying how, since their new RoboCandidateMatic was installed, the calibre of candidates has improved immeasurably. Only in 10 years hence, when a proper analysis is done, will it be shown to be not much better than random selection.

    The best way to choose candidates will always be via human-to-human interaction. Outsourcing that to our robot overlords means you will end up, at best, with a group of groupthink identi-clones who meet some secret AI algorithm.

    1. yetanotheraoc Silver badge

      Re: Here we go again

      "The best way to choose candidates will always be via human-to-human interaction."

      Sounds too much like work.

  9. EricB123 Silver badge

    What a Crock of Shit

    Over my career I have had to put up with countless interviews where the questions I was asked had absolutely nothing to do with the job I was interviewing for.

    I got drilled on how to make the big O of an algorithm smaller by some specified but arbitrary amount, on whether I could code the Towers of Hanoi in three chords or less, I mean 10 lines of infinidash, you get the idea. But in reality the job entailed taking over software maintenance from the last guy who quit, got fired, retired or died.

    My point is, trying to introduce the concept of fairness into interviews like this ... well, you get the idea!
