California suggests taking aim at AI-powered hiring software

A newly proposed amendment to California's hiring discrimination laws would make AI-powered employment decision-making software a source of legal liability. The proposal would make it illegal for businesses and employment agencies to use automated-decision systems to screen out applicants who are considered a protected class …

  1. sreynolds

    The burden is as always...

    Good luck proving this. I guess it depends on how the legislation is written, but good luck anyway.

    1. Anonymous Coward

      Re: The burden is as always...

      You just outlaw automated sorting of resumes, "AI" or not. Period. It does not work anyway. In the States, if it's being used, it will come out in the discovery phase unless the company wants to compound its woes by engaging in perjury. I doubt many HR employees would want to perjure themselves "for the company".

    2. Yet Another Anonymous coward Silver badge

      Re: The burden is as always...

      The state runs a bot that submits the same CV under male/female names and different ethnicities, runs stats on the results, then takes the company to court.

      The company then either has to argue that it is guilty of using ML which just looks for CVs that match the board members, OR that it has people deliberately screening out women and low-albedo candidates.
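The audit the comment sketches is a classic correspondence study. As a rough illustration of the "runs stats on the results" step (hypothetical numbers, standard library only), a two-proportion z-test on callback counts is one simple way to decide whether the gap between two name groups is larger than chance:

```python
import math

def two_proportion_z(callbacks_a, sent_a, callbacks_b, sent_b):
    """Two-proportion z-test: compare callback rates for two groups
    of otherwise-identical CVs (e.g. male-coded vs female-coded names)."""
    p_a = callbacks_a / sent_a
    p_b = callbacks_b / sent_b
    # Pooled proportion under the null hypothesis of no difference
    p_pool = (callbacks_a + callbacks_b) / (sent_a + sent_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / sent_a + 1 / sent_b))
    return (p_a - p_b) / se

# Hypothetical audit: 500 identical CVs sent per name group
z = two_proportion_z(90, 500, 60, 500)
print(abs(z) > 1.96)  # |z| > 1.96 => gap unlikely to be chance at the 5% level
```

With these made-up counts (18% vs 12% callback rates) the test flags a statistically significant gap; a real audit would also need to control for submission timing and job category.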

  2. Anonymous Coward

    I guess I will have to wait then

    Before I publicly comment that they should probably scrap this and start over. Not because the basic idea is bad, just that this is exactly how idiots over-regulate things without actually accomplishing their goals. This proposal is a mountain of bureaucratic paperwork with no teeth to prevent abuse. As sreynolds said, good luck proving this. Having the model data does not mean the regulator will understand any of it. The way it's written, it is only useful in a prolonged investigation, which is to say after someone's rights have been violated.

    Scrap this and replace it with language requiring companies that want to use this kind of technology to show that they understand and can control its biases BEFORE it's deployed in the first place.

    We make them do drug trials, we should make them do something similar for machine learning algorithms.

    1. Yet Another Anonymous coward Silver badge

      Re: I guess I will have to wait then

      >Not because the basic idea is bad... without actually accomplishing their goals

      Cynically, you would say that this achieves the goal of telling their voters "we stopped evil AI" while telling the businesses that fund them "don't worry, it won't change anything", while the lawyers on both sides figure they can bill $$$ arguing over it.

    2. Snake Silver badge

      Re: I guess I will have to wait then

      Scrap this and replace it with language requiring "companies that want to use this kind of technology to show that they understand and can control its biases BEFORE it's deployed in the first place"

      And exactly how do you propose this? The vast, vast majority of people using tech have no idea how it works, let alone can prove that it is working as expected - if they understood the tech before it was implemented, World+dog wouldn't even be having this discussion. They would understand the limitations and either program accordingly or only use the program for guidance, not final decisions. It is the lack of full understanding of what is happening inside these black-box "AI" programs that is causing the concern.

    3. Kanhef

      Re: I guess I will have to wait then

      Rather than banning the use of automated systems, simply make a law that companies are liable for any bias in those systems, whether or not it was intended. Companies will either stop using it, or their lawyers will insist that vendors prove that their system isn't biased before using it. Win-win.

  3. analyzer

    HR huh?

    As if we needed any more proof that HR has nothing to do with humans

    1. Neil Barnes Silver badge
      Flame

      Re: HR huh?

      At least when a company had a personnel department (are there any left?) there was the illusion that an employee was a person. Human Resources just treats people as plug-in replaceable parts.

      You'd think they'd have learned...

      1. Anonymous Coward

        Re: HR huh?

        At my place "Human Resources" have renamed themselves "People Services" and call all their staff "partners". They are still, as they always have been, utterly useless.

        1. Anonymous Coward

          Re: HR huh?

          Ours are now People & Capabilities with P&C business partners. To be fair, they’re the best I’ve worked with but still have huge difficulty dealing with anything that’s not really straightforward. They’d rather take the easy option, no questions asked.

        2. YetAnotherXyzzy

          Re: HR huh?

          Only merely useless? Not outright noxious? I'm jealous.

  4. FozzyBear
    Terminator

    So HR staff have been replaced with cold, uncaring, faceless, soulless machines with a narrow band of "intelligence"

    Yeah, to be honest a distinction without a difference

  5. trindflo Silver badge
    FAIL

    Equal opportunity

    "That same study suggests that HR software of the kind covered by the proposed California law is one of the reasons why employers are having trouble filling roles, too."

    Seems like both the candidates and the businesses are getting bot-humped by the same tool.

  6. Neoc

    If human beings can be made to justify their decisions, then AI software should also. If the software can't tell you WHY it made the decision, then hold the users in contempt. Yes, the users. Start fining the people/companies who use the software and see how quickly they flee developers who sell black boxes.

  7. Walt Dismal

    Illustrating what is wrong in the industry

    Let me illustrate exactly how ATS software is flawed and discriminatory.

    Suppose an HR department uses externally supplied software (some service providing it) that rejects an experienced doctor for having had too many patients, or a lawyer for winning too many cases.

    Sounds absurd? Well, what has been happening is that companies that put up job solicitations for contractors REJECT experienced contractors for having too many projects on their resume. The flawed reasoning in these ATS systems is that anyone with too many jobs is automatically classified as a job-hopper and unreliable, and so the applicant gets rejected, no matter how qualified or how good a fit they are. The problem arises because ATS software companies wrongly apply criteria suited to judging salaried employees to contractor technologists instead. The HR department receives data scoring the applicant low on the wrong criteria, and so a needed, qualified candidate is rejected.

    My personal experience with this has been incredibly frustrating, and it shows the need for legislative action. I've actually had several cases where I was provably the only one in the industry who matched certain criteria (because I was the originator who created the knowledge area they were looking for, and there are no other experts on it yet!) but was rejected on the grounds of too much experience (too many jobs). Frustrating.

    There are certain ATS software suppliers whose product is garbage but they have strong sales departments. These companies need to be reined in, legislatively.
