AI hiring bias? Men with Anglo-Saxon names score lower in tech interviews

In mock interviews for software engineering jobs, recent AI models that evaluated responses rated men less favorably – particularly those with Anglo-Saxon names, according to recent research. The goal of the study, conducted by Celeste De Nadai as an undergraduate thesis project at the Royal Institute of Technology (KTH) in …

  1. Anonymous Coward
    Anonymous Coward

    You can't add arbitrary data to remove bias.

    "De Nadai has a theory about the findings but said she cannot prove it: She believes the bias against men with Anglo-Saxon names reflects an over-correction to dial back output that was biased in the opposite direction – seen in prior studies."

    So...garbage in garbage out then?

    1. localzuk

      Re: You can't add arbitrary data to remove bias.

      It seems the current method of trying to fix AI bias is the equivalent of adding more and more weights to an unbalanced wheel on a car, until they've got more weights than wheel.

      1. GraXXoR Bronze badge

        Re: You can't add arbitrary data to remove bias.

        I don’t usually like car analogies, but this one is 100% spot-on.

      2. LybsterRoy Silver badge

        Re: You can't add arbitrary data to remove bias.

        Nah! It's cycles and epicycles - bring on the ellipse

    2. hfo1

      Re: You can't add arbitrary data to remove bias.

      Yes, hiring is a sea of flawed data and biased processes. If you use these to train systems, and then try to correct afterwards, you are putting the cart before the horse.

      1. Anonymous Coward
        Anonymous Coward

        Re: You can't add arbitrary data to remove bias.

        ...whilst leaving the barn door closed and the barn is on fire.

        1. The Dogs Meevonks Silver badge

          Re: You can't add arbitrary data to remove bias.

          Meanwhile, the cart is outside and all the horses are locked inside and the farmer is trying to move the cart instead of trying to save the horses.

          1. jake Silver badge

            Re: You can't add arbitrary data to remove bias.

            Rather, the farmer has been sued by the local Historical Society to force him to move the hundred year old cart, as the horses can be replaced but the cart cannot.

            1. Trigonoceps occipitalis

              Re: You can't add arbitrary data to remove bias.

              What about the battery and the staple?

              1. JulieM Silver badge

                Re: You can't add arbitrary data to remove bias.

                Correct!

          2. MyffyW Silver badge

            Re: You can't add arbitrary data to remove bias.

            I was going to suggest I'm glad nobody has thrown the baby out with the bath water. But then maybe that's a product of my gender identity?

    3. heyrick Silver badge

      Re: You can't add arbitrary data to remove bias.

      "So...garbage in garbage out then?"

      Can't help but wonder why the name is relevant at all. There's a job that needs done, and a bunch of candidates. Whip through the list flagging everybody who is a close match to the wanted requirements, discard the rest.

      If you're looking at what somebody is called, you're doing it wrong (and in some places there are laws about that sort of thing).

      1. Helcat Silver badge

        Re: You can't add arbitrary data to remove bias.

        The name shouldn't be, but if you've included something to encourage diversity in results, you've introduced a bias that could favour one group over another.

        It's the wrong approach, obviously: You want to see people who are qualified to do the job. However, first you need to encourage people from the diverse backgrounds to become qualified in that subject in order to be suitable candidates. If the people of that group have a preference for something else, that can be difficult to achieve. Yet that is the reality: Certain groups prefer certain types of work or certain subjects, or are simply not willing to expend the effort to become qualified in a particular field.

        It's why people from poor communities aren't inclined to learn piano with an aim to become a concert pianist. Oh, it can happen (and does) but it's very rare. Far more common to find those with a more wealthy background taking that route. Same for sports, but also engineering, medicine, business: It's all about how they're encouraged when young to study towards that field so they can become qualified in it. No point studying art if you want to become a medical doctor if you're not also studying the necessary subjects from which to build that medical career (biology, chemistry, mathematics etc).

        1. Yet Another Anonymous coward Silver badge

          Re: You can't add arbitrary data to remove bias.

          That's rather the problem.

          1, Everyone knows that nobody hires [black/northern/ginger/women] for job X. So if I'm [black/northern/ginger/female] it's not economically sensible for me to spend years and $$$$ qualifying for job X

          2, Everyone hiring for job X claims that they aren't against [black/northern/ginger/women] - it's just that there are no suitable [black/northern/ginger/women] candidates

          1. Jimmy2Cows Silver badge

            Re: You can't add arbitrary data to remove bias.

            Replace "Everyone knows" with "Everyone believes" and you've probably nailed it. These are deeply ingrained cultural biases that no amount of diversity and inclusivity training can overcome. If the candidate pool doesn't include the diversity the government wants, the government needs to look at why that is, not impose artificial counter-biases that just make it more arduous to find good candidates. Who they are doesn't matter. What they can do does matter. Artificially restricting the pool based on who they are isn't helping anyone.

            Neither is saying business demographic makeup must include X% minorities, when X does not match the local or national proportion, nor the demographic of people actually qualified to do a particular job.

            Just hire the most qualified person for the job. Ignore name, gender, colour, orientation and every other protected characteristic. They are not relevant.

            1. GraXXoR Bronze badge

              Re: You can't add arbitrary data to remove bias.

              I still remember my university days, when the photographer shooting the new website images of our lab told us in no uncertain terms that we had to bring in several people of colour (not the exact words he used), two wheelchairs ("for a couple of actors") and a "bunch of girls" from the art department to wear white coats and try not to look foolish holding random pieces of equipment and wearing safety goggles.

              Back in the early 90s our undergrad engineering department almost entirely comprised white males.

              1. dinsdale54

                Re: You can't add arbitrary data to remove bias.

                Yup.

                In the mid 80s a schoolmate visited Manchester - possibly UMIST - and brought back an alternative prospectus put out by the students. There was a Q&A section for one of the engineering courses and one question was:

                Q: Is there any sex discrimination?

                A: No, she was not discriminated against.

            2. LybsterRoy Silver badge

              Re: You can't add arbitrary data to remove bias.

              Unfortunately, one of the qualifications might include language capability (no, not computer languages) and there you have a bias, but unless language training is part of the pay package it's probably a good one. As an example, I'd definitely advise against employing a native Glaswegian who has never lived elsewhere. Sorry, I can't find the right episode of "It'll be alright on the night" (might be Aberdonian, not Glaswegian).

              1. Anonymous Coward
                Anonymous Coward

                Re: You can't add arbitrary data to remove bias.

                Off topic but I once worked for a large company in the UK which offered outsourcing services.

                An insurance and pension company in the South East of England outsourced (with their IT) the call centre.

                They feared that their staff would not be able to cope with 'furren' accents and so required that it be hosted in the UK also.

                It was outsourced to Greenock (Clydeside, west of Glasgow). Everyone was so happy they didn't have to struggle with Indian accents, I believe.

                1. Yet Another Anonymous coward Silver badge

                  Re: You can't add arbitrary data to remove bias.

                  Don't bank call centers prefer Scots accents because they are more "trusted with money"? IIRC Yorkshire and Geordie were more "friendly"

                  1. Ptol

                    Re: You can't add arbitrary data to remove bias.

                    Yes, but please, for the love of all that is holy, choose staff with Scottish accents from the east coast of Scotland. The last time I was in Scotland was just under 20 years ago, installing medical equipment into the homes of people with COPD around Paisley (near Glasgow). I couldn't understand the patients, and they couldn't understand me. Thankfully I was escorted by a very nice practice nurse from Dundee who spent the whole day translating.

              2. Jellied Eel Silver badge

                Re: You can't add arbitrary data to remove bias.

                Unfortunately, one of the qualifications might include language (no not computer) capability and there you have a bias, but unless language training is part of the pay package its probably a good one.

                I don't see that as a bias, as much as common sense. If I'm bidding on a job in Africa, I might want a contractor who's fluent in the relevant Bantu language, along with English. Same if the bid is for France, or Quebec, I'd want French speakers. But recruiters are generally a PITA, pretty lousy at sifting candidates, and I'd never hire one that uses 'AI'. If they want to get paid, I want them to put some effort in and not just rely on pushing the button on a model.

                Other potential bias is easily handled. I've never been big enough to require DEI reporting, but just tell recruiters to send me the relevant details. Skills, experience. The rest doesn't matter because I want to hire people that can do the job, not fill quotas. I don't care what color, gender, if they're animal, vegetable or mineral.. It's the skills they can bring to the job that matters.

            3. Ptol

              Re: You can't add arbitrary data to remove bias.

              Actually, i disagree.

              Repeated long-term studies have consistently shown that diverse teams outperform teams filled with the most technically capable people that money can buy/find. The critical criterion is that teammates need to have sufficient skills to contribute to the team performance, and that the team values listening to the thought processes of everyone on the team - and their different perspectives.

        2. MachDiamond Silver badge

          Re: You can't add arbitrary data to remove bias.

          "It's why people from poor communities aren't inclined to learn piano with an aim to become a concert pianist. Oh, it can happen (and does) but it's very rare. Far more common to find those with a more wealthy background taking that route."

          Sting said it best, "you can make a killing in the music business, but it's hard to make a living". To make money as a concert pianist, you have to be at the very top. Even if you are, the amount of work to get there is enormous. Might be better to take that training and go into prog rock like Jordan Rudess of Dream Theater. Somebody coming from a more humble background should also learn a trade to keep food in the pantry and the landlord happy while pursuing the pianist thing as well as keeping an eye out for other piano playing positions. Provided they don't evaporate due to AI.

          1. JT_3K

            Re: You can't add arbitrary data to remove bias.

            I'm always reminded of the joke "What's the difference between a 16'' pizza and a jazz musician? A 16'' pizza can feed a family of four". It's a bit crass but a reminder that, further to the comment about training, it's much easier to undertake that training and focus on becoming exceptional rather than just good, when the need to fund yourself (i.e. no generational wealth or family support) isn't an issue.

            (FWIW I interviewed Jordan in Coventry back in '05 and he was bloody lovely. They were impressive live too.)

            1. MachDiamond Silver badge

              Re: You can't add arbitrary data to remove bias.

              "(FWIW I interviewed Jordan in Coventry back in '05 and he was bloody lovely. They were impressive live too.)"

              I used to run into Mike Portnoy every once in a while and he's pretty cool as well. It's always refreshing to meet really talented people like them that aren't full of themselves.

        3. IGotOut Silver badge

          Re: You can't add arbitrary data to remove bias.

          That's a very poor example.

          "It's why people from poor communities aren't inclined to learn piano with an aim to become a concert pianist. "

          Pianos are very, very expensive (yes you could get an upright for £50, or even free), but then you'll find the tuning costs more than a new one.

          You could pick up an 88-key weighted digital with sustain and damper for a few hundred quid. But then throw in several thousand pounds of lessons.

          My daughter is learning violin and that's close on a thousand a year... and that's just a "hobby".

          So saying "poor" people aren't inclined is a lie. Poor people CAN'T.

          Big difference.

          1. jake Silver badge

            Re: You can't add arbitrary data to remove bias.

            "Poor people CAN'T."

            How to tell us you know nothing of the history of music without saying "I know nothing of the history of music".

            1. Ian Johnston Silver badge

              Re: You can't add arbitrary data to remove bias.

              And what, precisely, does the history of music tell us about the cost of instrumental tuition nowadays? Sure a few poor boys (mainly) have made good in the past, but the current situation is dire if you can't stump up for private lessons.

              1. doublelayer Silver badge

                Re: You can't add arbitrary data to remove bias.

                One of the things it tells us is that a lot of musicians, including many influential ones, have learned without as many resources as you spend on it. You underestimate the amount of achievement that some people* can manage without expensive tuition. For example, get one of those unwanted pianos, a free tone generator website, and a wrench (ideally one shaped well for the tuning pegs, but a normal wrench theoretically could work if you're careful). Tuning will take a lot longer than a professional tuner would take. I've done it. The piano ended up tuned anyway. There are several problems, like having somewhere to put the piano and moving it there.

                * In order to manage this, you should ideally be really passionate about it. There are a lot of things you can learn without private lessons, but mostly by spending a lot of time doing it which you won't do if you don't really enjoy it. For any parents out there, being forced to do it for long time periods does not help, lessons or no lessons. It's a little like working with computers in that, while you will learn faster with private lessons at the beginning, you will not be good without a lot of self study and can build a lot of skills with that alone.

                1. Jellied Eel Silver badge

                  Re: You can't add arbitrary data to remove bias.

                  * In order to manage this, you should ideally be really passionate about it. There are a lot of things you can learn without private lessons, but mostly by spending a lot of time doing it which you won't do if you don't really enjoy it.

                  I've been having fun with this. My partner's youngest is 8, loves music & wanted to learn the violin. I wanted to keep her interested in music, so bought her a used Yamaha Yev-105 so I could hook that up to my trusty old Korg Triton Studio synth. Now she's spending hours playing both the violin and the synth, and it's quite scary how fast she's learning both. The teachers I paid, and those at school, initially seemed a bit dubious and muttered about acoustic violins, but the setup just showed how she was learning: if she plays the right tones, the synth can confirm it. Plus there's a ton of MIDI tracks out there she can play along to and practice with. And as an added bonus, she can keep the sound low, or use headphones. The teachers warmed to the idea because of the progress she's making, and she's having fun practicing.

                  Instruments can be expensive, but there's a ton of used ones on the market from people who've given up, upgraded, parents whose kids gave up etc, so it need not be that expensive.

          2. MachDiamond Silver badge

            Re: You can't add arbitrary data to remove bias.

            "My daughter is learning violin and that's close on a thousand a year....and that's just a"hobby"."

            You can spend much more than that or far less. I've learned bass guitar mainly through online lessons and a fairly inexpensive video course. I have a cello course, but don't have a cello yet and I'll have to see if my ancient hands can deal with playing as the bass is pretty hard on me.

            There's also a price to pay to go from a hobby to a serious profession. Especially a profession that might not pay that well, or very often, or both.

            I've seen the same thing with people getting degrees in things such as dead languages. Yes, it's good to have people working in those sorts of fields, but those fields don't need more than a handful of people, and the most prominent people that had come before were the third or fourth child of a wealthy family that "needed something to do" (other than drugs and intrigue). Spending months at a time digging under pyramids in Egypt was just the ticket.

            1. Ian Johnston Silver badge

              Re: You can't add arbitrary data to remove bias.

              A past chairman of BP was once asked why he employed so many Oxford and Cambridge classics graduates. "Because they sell more oil", he replied.

        4. Andrew Scott Bronze badge

          Re: You can't add arbitrary data to remove bias.

          Really, that's obvious. Women prefer to work at home, cook, wash clothes, pick up crap left on the floor by their husbands, and raise children. Definite preference. Should automatically disqualify them from real jobs. Just ask my sisters - which is why I'm posting this anonymously.

        5. JulieM Silver badge

          Re: You can't add arbitrary data to remove bias.

          Sorry, but that is entirely backward thinking.

          People from poor backgrounds don't seem inclined to learn piano with an aim to become a concert pianist because poverty gets in the way of learning music in various ways that a rich person is incapable of appreciating.

          Women and minorities are put off from working in certain fields by the actions of some of the people already working in those fields.

      2. The Indomitable Gall

        Re: You can't add arbitrary data to remove bias.

        > If you're looking at what somebody is called, you're doing it wrong (and in some places there are laws about that sort of thing).

        Yes, and the unspoken problem is that the AI has found a way to intuit the bias from hidden patterns, so it can still be racist/xenophobic/misogynist/generally bigoted by spotting patterns that humans wouldn't. The people pulling the levers are in a panic that it's acting like a bigot, so they're actively seeking to stop it preferentially treating the people who traditionally always get preferential treatment, and the only way they can do that is by getting the AI to be biased against them. And if they're doing that, they absolutely cannot claim to be acting fairly and objectively. Equal opportunities has always meant acting preferentially towards the traditionally disadvantaged demographics, but that usually involves quotas and other such explicit policies -- this is just opaque beyond belief.

        1. Anonymous Coward
          Anonymous Coward

          Re: You can't add arbitrary data to remove bias.

          I think you got that backwards, and are incorrect.

        2. TheMeerkat Silver badge

          Re: You can't add arbitrary data to remove bias.

          If you support quotas and “positive” discrimination you are a racist.

          1. Ian Johnston Silver badge

            Re: You can't add arbitrary data to remove bias.

            Ah, the anguished howl of the mediocre white man is heard.

          2. The Indomitable Gall

            Re: You can't add arbitrary data to remove bias.

            Positive discrimination is simply correcting for normal human bias. People trust people who "look right" for the job they're hiring for. People who "look right" are the people most similar to those you've seen doing the job before. The people you've seen doing the job before are probably white males. This automatically biases people to hire white males.

            There are two philosophies in positive discrimination.

            The first is just about visibility -- the more people who aren't white men that do a job, the more people who aren't white men will be hired to do the same job in the future. Over time the exposure to people who aren't white men will undermine the unconscious bias that white men are better at the job. This philosophy says that it's OK to sacrifice short term productivity in order to guarantee better productivity in the long term.

            The second is about pure statistics -- we know that people are x% more likely to hire a white male than anyone else, so we correct for that by making it harder to hire a white man and easier to hire someone who's not a white man. This not only has the long-term effect of reducing the bias towards white men, but it actually means you're more likely to get the most talented individual now, so there are long-term and short-term productivity gains.

        3. SundogUK Silver badge

          Re: You can't add arbitrary data to remove bias.

          "Equal opportunities has always meant needing to act preferentially on the traditionally disadvantaged demographics..."

          Utter bollocks. This is equity, not equality.

          1. The Indomitable Gall

            Re: You can't add arbitrary data to remove bias.

            Utter semantics. I mean... you can think I'm wrong on a technicality, but that doesn't make my whole argument "bollocks".

      3. Justthefacts Silver badge

        Re: You can't add arbitrary data to remove bias.

        European CVs. Can’t stop the candidates themselves writing whatever they want to, and some countries are feckin’ eejits when it comes to “what information do people think their CV should contain”

        Both French and Germans seem convinced that I need to see a photo of them on the CV. Why? I don’t know.

        And it’s not just photos - they put their home address up the top too. As in “please Google check that I live in a nice middle-class district”. I can’t begin to understand what culture that would be appropriate in.

        Germans sometimes put "Familienstand" (marital status) and "Geburtsdatum" (date of birth) on there. They literally write on their CVs the answers to privileged questions that it's illegal to ask. The mind boggles.

        1. Mike007 Silver badge

          Re: You can't add arbitrary data to remove bias.

          Just make sure you don't take age in to account when comparing the CV of the person with 10 years in their previous job and the person with no work history who got their degree in 2024.

          I am not sure how trying to cancel out age from that scenario works! Do you compensate for the fact that the young person hasn't had the opportunity to spend 10 years gaining experience and therefore this can't be held against them???

          1. Justthefacts Silver badge

            Re: You can't add arbitrary data to remove biases

            Of course you can work that out. It’s just weird that it’s literally illegal to ask someone in an interview “how old are you”, and yet some people write it on their CVs.

            It’s even odder to see “Married, two children” on a CV though, which is presumably code for “stable and serious in my career” (in their own head). Or even “lay minister in Lutheran church”. Embedded QR codes are a recurring theme (“oh, a link from somebody I don’t know, that I don’t know where it goes, I’ll certainly be opening that… not”). There’s the academics who think that their four-page publication list on étale cohomology is just the ticket. List goes on.

        2. LybsterRoy Silver badge

          Re: You can't add arbitrary data to remove bias.

          Strangely enough, there is a reason for putting your home address at the top - it helps letters get to you. OK, I know times have changed, but it's traditional. Whilst you may not be interested in the candidate living in a nice area, it might be useful to know they live just round the corner from the prospective employer, not halfway round the world.

          1. Justthefacts Silver badge

            Re: You can't add arbitrary data to remove bias.

            Anyone who is expecting me to contact them by letter, probably isn’t a good fit for a tech job…..

            And nor do I care if they move from halfway round the world, so long as they have the legal right to work here, and turn up on the day we need them.

            Also….British citizen overseas? And, your definition of round the corner doesn’t matter if they are the sort to bike to work. You just can’t make assumptions like that.

      4. IamAProton

        Re: You can't add arbitrary data to remove bias.

      I think the problem is that AI is dumb: it recognizes patterns and applies them. I remember reading about an issue with image classification years ago, where dogs on snow were classified as wolves because most of the images used for training were of wolves in the snow.
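      That wolves-in-snow failure is a textbook case of shortcut learning: the model latches onto whichever feature best predicts the label in the training set, whether or not it is the causally relevant one. A toy sketch of the effect (the data and the one-feature "learner" are invented purely for illustration):

```python
def best_single_feature(train):
    """Pick the single binary feature that best predicts the label on the
    training set -- a crude stand-in for a lazy pattern-matcher."""
    n_features = len(train[0][0])
    scores = []
    for f in range(n_features):
        # Fraction of training examples where feature f equals the label.
        acc = sum(x[f] == y for x, y in train) / len(train)
        scores.append((acc, f))
    return max(scores)[1]

# Each example: features (in_snow, wolf_like_build), label 1 = wolf, 0 = dog.
# Wolf photos are almost always snowy; dog photos almost never are.
train = ([((1, 1), 1)] * 40 + [((1, 0), 1)] * 8 +
         [((0, 1), 0)] * 10 + [((0, 0), 0)] * 42)

chosen = best_single_feature(train)   # picks feature 0: "in_snow"
husky_on_snow = (1, 0)                # snowy photo, dog-like build
prediction = husky_on_snow[chosen]    # classified as a wolf
```

      Here "in_snow" predicts the training labels perfectly while the animal's actual build does not, so the lazy learner keys on the background and a husky photographed in snow gets labelled a wolf. Real image models fail the same way, just with learned features instead of hand-made ones.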

    4. gecho

      Re: You can't add arbitrary data to remove bias.

      System programmed with idealized social values concocted in liberal echo chambers, but AI can't understand society doesn't actually believe in any of that stuff.

    5. Mostly Irrelevant

      Re: You can't add arbitrary data to remove bias.

      If you were smart, what you'd do is not include the applicant's name in the dataset at all. Then it can't discriminate based on name because it doesn't have it.

      1. SCP

        Re: You can't add arbitrary data to remove bias.

        Whilst the study used names to introduce an obvious race/ethnicity marker into a set of controlled data (then observed the AI making different assessments) it is also noted that in open data there are other indicators, based on the use of language, that can indicate race/ethnicity/etc and can be picked up by AIs.

      2. Jimmy2Cows Silver badge

        Re: You can't add arbitrary data to remove bias.

        Which is kinda the whole point. Just include skills, work history, qualifications, experience. Nothing else. If you're choosing which candidates to interview based on anything else, you're doing it wrong.

        I don't care who you are. I don't care where you live, or what you like to do in your spare time. I'm hiring you to do a job. If your skills, work history, qualifications, experience says you can do that job better than other candidates, you'll get an interview. Simples.

      3. LybsterRoy Silver badge

        Re: You can't add arbitrary data to remove bias.

        Actually, if you were smart you wouldn't use AI.

      4. doublelayer Silver badge

        Re: You can't add arbitrary data to remove bias.

        Most of the AI hiring software just takes resumes and throws them, verbatim, into the AI. Those resumes have names on them so you can identify the person to contact them later, and neither the employer nor the software does anything about them. That's not the only place where the AI can start introducing bias into the process, just the most obvious and easiest to test. There are a lot of filtering things that a smart employer would do to reduce bias, but a smart employer, when offered software to help, would ask what the software does and how they can know it isn't going to discard good candidates. Since the AI software is based on dubious logic and frequently does discard good candidates, the smart employers already don't use it in favor of software or manual processes that don't claim to do everything for you.

  2. Joe W Silver badge

    WRONG!

    "Addressing these biases requires a nuanced approach, considering both the model's characteristics and the context in which it operates," the study suggests. "When classifying or evaluating, we propose you always mask the name and obfuscate the gender to ensure the results are as general and unbiased as possible as well as provide a criteria for how to grade in your system-instruct prompt."
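    The study's masking suggestion boils down to a redaction pass before the CV text ever reaches the scoring model. A minimal sketch, assuming a plain-text CV; the `[CANDIDATE]` token and the pronoun table are illustrative choices, not anything specified by the study:

```python
import re

def mask_candidate(cv_text: str, candidate_name: str) -> str:
    """Redact the candidate's name and neutralise gendered pronouns
    before the text is handed to a scoring model."""
    masked = cv_text
    # Replace each part of the name with a neutral placeholder token.
    for part in candidate_name.split():
        masked = re.sub(r"\b" + re.escape(part) + r"\b", "[CANDIDATE]",
                        masked, flags=re.IGNORECASE)
    # Swap gendered pronouns for neutral ones (crude word-boundary pass;
    # it cannot tell object-"her" from possessive-"her", for instance).
    pronouns = {"he": "they", "she": "they", "him": "them", "her": "them",
                "his": "their", "hers": "theirs"}
    def neutralise(m):
        word = m.group(0)
        repl = pronouns[word.lower()]
        return repl.capitalize() if word[0].isupper() else repl
    pattern = r"\b(" + "|".join(pronouns) + r")\b"
    return re.sub(pattern, neutralise, masked, flags=re.IGNORECASE)
```

    Applied to "John Smith led his team. He shipped v2." this yields "[CANDIDATE] [CANDIDATE] led their team. They shipped v2." Even so, wording itself can still leak demographic signals, so masking the name is necessary rather than sufficient.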

    Just don't (expletive deleted) use these (vernacular + slur removed) systems. They are clearly not fit for purpose.

    Sheesh.

    I need a weekend. Or a holiday. Or maybe a change of career, just sitting atop a mountain[1].

    [1] Maybe a mountain of skulls of my enemies, i.e. those that are pushing AI (or whatever the latest fad is at that time). As Cohen the Barbarian remarked, you'd need a lot of skulls as they do not pile up all that well. Did I mention I need a weekend soon?

    1. localzuk

      Re: WRONG!

      Yeah... but if you don't use these systems, you will need to rehire all those people you laid off! And that'll mean no new yacht this year.

      1. Guy de Loimbard Silver badge

        Re: WRONG!

        Succinct and to the point.

        If you're going to use AI to do some of the heavy lifting, then you should expect "great" results, if you don't validate the outputs.

        Seriously wonder about the future of the species if we throw all our belief into something this juvenile and unseasoned!

      2. Anonymous Coward
        Anonymous Coward

        Re: WRONG!

        Yeah I don't think money is entirely the reason for using AI. It's one factor, but not the only factor.

        AI filters allow recruiters to control for factors that would be otherwise illegal / frowned upon without directly targeting those factors or alluding to them because AI can see patterns that we can't.

        "That's not true, we didn't reject his application because he was too old, our AI doesn't use date of birth as a metric to determine whether a candidate is too old...er...suitable for the role it uses other metrics to filter out the old white bastards...er...unsuitable candidates".

        1. cyberdemon Silver badge
          Devil

          Re: WRONG!

          Yes exactly. Bias-free because er, we outsourced our bias to a completely inscrutable black-box bullshit machine that nobody can prove is biased cos our supplier won't reveal their weights and prompts due to trade secrets

          It's a bit like how Drax is emissions-free, because all the emissions are deemed to grow back into trees and no we won't tell you where the logs came from cos then you might find out that they aren't growing back

          It seems as if the main purpose of AI is to allow the unscrupulous to get away with stuff. Want to make discriminatory decisions in your company? Get AI to do it. Want to con thousands out of their savings but can't find cheap trustworthy staff? Get AI to do it. Did something very stupid and got caught on camera? No problem, now you can dismiss it as an AI-generated fake.

        2. doublelayer Silver badge

          Re: WRONG!

          I don't think it's that simple. Someone who wants discrimination can't guarantee that their AI will give it to them, and most of them are too stupid to understand what the AI is and is not doing. If they did, they'd probably be too worried that the AI might discriminate against a group they don't want discriminated against to use it.

          From those I've seen, AI recruitment software is popular with people who don't know how to do their recruitment job and want software to do it for them. People who don't know how to read a resume to determine if someone has necessary skills figure that an AI can be trained to know that faster than they can learn what all those technical terms mean. Someone who can do that but has way more resumes than they want to read can assume that the AI can do so well enough and much faster. I don't think money is the largest factor, though second-largest wouldn't surprise me. I think the largest factor has to be laziness. Intentional bigotry is far down the list, which is one reason why this and all the other biases are such a problem; people who would want to avoid bias are getting it anyway and the only question is which biases their software of choice is giving them today. From the many studies on this, it seems like the answer might be all of them.

          1. Justthefacts Silver badge

            Re: WRONG!

            Disclaimer: I don’t use AI in hiring, and don’t see any need to do so for my company.

            But sifting through CVs takes a surprising amount of time. Hiring-manager isn’t really a job, it’s something you do on top of your main duties, to desperately plug the gaps in the wall of committed work marching towards your team. Two to three minutes per CV to skim-read it; 100 CVs per week….hang on, that’s an extra half day per week, just sifting CVs! Plus the hour per phone interview, three or four per week. And that’s all on top of the actual 60-hr week day-job of managing team, project, forward strategy etc. If you think laziness is a factor here, you’re under-estimating a job you haven’t done.
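            As a quick sanity check on that arithmetic (a throwaway sketch using only the figures given in the paragraph above — minutes per CV, CVs per week, and interview load are the comment's numbers, not mine):

            ```python
            # Back-of-envelope check of the CV-sifting load described above:
            # 2-3 minutes per CV, ~100 CVs per week, plus 3-4 one-hour phone interviews.
            minutes_per_cv = (2, 3)
            cvs_per_week = 100

            low, high = (m * cvs_per_week / 60 for m in minutes_per_cv)
            print(f"CV sifting alone: {low:.1f}-{high:.1f} hours/week")  # roughly half a working day

            # Adding the 3-4 hour-long phone interviews on top:
            print(f"With phone interviews: {low + 3:.1f}-{high + 4:.1f} hours/week")
            ```

            So the "extra half day per week" claim holds up, before interviews are even counted.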

            Now, reading 100 CVs a week is still do-able, just part of the landscape. But I suspect that the larger organisations may well have smacked into the same AI problem that took down several publishing houses for months, until they figured out a plan. Suddenly the number of *submissions* may well have gone x10 or even x100 overnight. How are you as an individual supposed to sift 1000 CVs per week, if 90% of them are script-automated, scraping the whole of the jobs database, with GPT4 taking the company’s published Job Description as an input to tailor their CV into hundreds of versions of exactly what you want to hear?

            In 2024, there’s literally nothing to stop a single individual sending 1000, or even 10000, *fully-tailored* CVs per day, targeting every single software company in the world. I mean, you may not have any particular intention to work in Canada, but absolutely why *wouldn’t* you line up half a dozen phone interviews for quant jobs in Toronto earning $1M a year, given that it probably takes you no more than an hour of playing with scraping scripts to set them up? It’s a numbers game. You don’t even need to know the name of a single company in that country to apply; leave that to GPT, it can read the company websites, check out news articles etc. It’s not like you’re tarnishing your reputation if GPT makes a dog’s breakfast on your behalf - they don’t know you, and have no contact with any other employer you’ll ever meet. If this problem hasn’t happened yet, I’m pretty sure that it very soon will.

            The answer may well be GPT4, poacher-turned-gamekeeper, to sift 10000 CVs back down to 100.

            1. Jellied Eel Silver badge

              Re: WRONG!

              How are you as an individual supposed to sift 1000 CVs per week, if 90% of them are script-automated, scraping the whole of the jobs database

              Simple really. You fire that recruitment agency and replace it with one that can send you, say, 10 well matched candidates, one of which you hire. And for good measure, fire the hiring manager, or give them remedial training so they can write a job spec tightly enough that they don't get 1k CVs a week. And then you double team with the accountants to point out that if there are 1,000 CVs a week, all pre-vetted by their chosen recruitment pimps, then there's no shortage of applicants and you can reduce salaries or rates. Unhappy with your current salary and benefits, underperforming minion? <clunk> Well, here are the 4,000 people who claim to be able to do the job for the same money, or less.

              About the only use I can see for AI is if there's an ability for that to filter out AI generated CVs.

              1. Justthefacts Silver badge

                Re: WRONG!

                “10 well matched candidates, one of which you hire.” Dream on. I’ve dealt with dozens of recruitment agencies, across half a dozen hiring companies, and not once were we ever “choosing between candidates”. When you find someone who is a good fit for the job, and doesn’t have any red flags, you hire them immediately.

                By “matching”, I think you imagine a process that’s basically pattern-matching on “have you used this language/framework/tool-chain” + how many years experience. If you look at people’s CVs, there is clearly a bunch of people who seem to believe this, and focus on a list of acronym salad (Rust, Perl, Python 3.13, node.JS, Angular).

                That’s just not the right picture at all. You wouldn’t hire a surgeon who listed their expertise as “stent, bone saw #6”. Maybe as a baseline, but usually I’m focused on subject domain knowledge; process; personal contribution; what you might call “architectural” stuff; evidence of success; task-time-estimation; and that’s even before we come to soft skills.

                So let’s pick one: in my company even at quite a junior level you are expected to do your own task estimation and stick to it. It’s more important to us on a practical level that you should be able to do this, and are experienced in getting your estimates right, than whether you are even 50% more productive in writing code than the other candidate. No “T-shirt estimates” here. Break down the task into segments, give time estimate, with appropriate uncertainty, and meet it. Can you do that? Or do you believe your area of magic is fundamentally not estimatable, “shut up and let me code, when it’s done it’s done”? Or you believe it’s “management responsibility” to estimate? You can see this stuff is really fundamental, and if we don’t agree then *we are not a match* (right? even if you think it’s us who are idiots, then we’re still not a match).

                So the point is - take a look at your own CV, where is your skill in task-estimation flagged? In your list of “technologies”? And this is just one of half a dozen things. Finding actual matching people is hard.

                If you disagree with me, that’s fine, go work for a company that you think do this “right”, and where the company think that you do it right. But you logically cannot insist that “matching is easy”, when the other side tells you that “matching is hard”…..

                1. jake Silver badge

                  Re: WRONG!

                  "When you find someone who is a good fit for the job, and doesn’t have any red flags, you hire them immediately."

                  Yes. Exactly.

                  I'll quite often hire somebody before I've even opened most of the other applicants resume/cv/other particulars.

                  Usually because after a quick meeting I know they will fit in to the existing team, and not just for technical reasons.

                2. doublelayer Silver badge

                  Re: WRONG!

                  I mostly agree with you, but I cannot and will not put task estimation on my CV. Not because I'm terrible at it. I am probably not excellent, but I can estimate even though I hate it like basically everyone else I've met. The problem is that task estimation is so general a thing that I can't really consider it a skill. If I say I am good at it, someone will tell me to estimate how much time it takes to write an app that can import CVs of candidates and sort them by qualifications and, when I ask where and from which formats we're importing and what qualifications we're sorting, they will see that as a refusal to estimate. If I say I am bad at estimation, people will assume that, if I get a task as basic as "Fix the bug where some of the strings in that window are showing English even when they should be using the localized ones already written", I would refuse to give them any prediction.

                  The former has happened. I don't think they were clueless about managing programmers. I think the people concerned were trying to get me to design a large system, but they phrased their request as a time estimate to disguise this. The people concerned weren't my employers or even a prospective employer, just some people who wanted to run a tech project without knowing anything about tech, who were probably trying to see how much free work they could get by asking everyone they knew who worked on something computery.

                3. Jellied Eel Silver badge

                  Re: WRONG!

                  “10 well matched candidates, one of which you hire.” Dream on. I’ve dealt with dozens of recruitment agencies, across half a dozen hiring companies, and not once were we ever “choosing between candidates”. When you find someone who is a good fit for the job, and doesn’t have any red flags, you hire them immediately.

                  So basically you're agreeing. You get a list of candidates from agencies, you select 1.

                  By “matching”, I think you imagine a process that’s basically pattern-matching on “have you used this language/framework/tool-chain” + how many years experience

                  And what do you think you are doing when you find someone who's a 'good fit'? You are pattern matching, without seemingly realising it. Usual process is shortlist, interview, maybe some skills tests and then hire the best candidate.

                  Or do you believe your area of magic is fundamentally not estimatable, “shut up and let me code, when it’s done it’s done”? Or you believe it’s “management responsibility” to estimate? You can see this stuff is really fundamental,

                  There's no magic, only business. It's also a fundamental part of the recruiting process. So shortlist candidates whose CVs say they can do the job. My area of magic was mostly bidding on RFPs and designing solutions for the clients. So shortlisted candidates would get sent an RFP to respond to, maybe 5 or 10 days to submit a response, and based on that, select candidates to interview.

                  So the point is - take a look at your own CV, where is your skill in task-estimation flagged?

                  Usually in stuff like 'proven track record of delivering complex solutions on time and on budget', perhaps with a couple of examples, and when interviewed, I'd be prepared to expand on those. It isn't something my business would normally have focused on because it's pretty much a given.

                  But you logically cannot insist that “matching is easy”, when the other side tells you that “matching is hard”…..

                  And yet your reply basically consists of explaining how you match..

              2. jake Silver badge

                Re: WRONG!

                How old are you? About 16?

                The RealWorld doesn't work the way you seem to think it does.

            2. doublelayer Silver badge

              Re: WRONG!

              "If you think laziness is a factor here, you’re under-estimating a job you haven’t done."

              It might take a long time, but that doesn't mean laziness isn't involved. If I take on a task that needs a hundred hours to do properly, but I decide to slapdash it in twenty, I am still being lazy even though I spent half a week on the thing I built. My desire to do that may be easy to understand if I've got lots of other things to do, but that does nothing to change whether my fast version is good enough for what I was asked to do. Laziness is always involved if you try to put less effort into a task than it needs to be done properly. If you've found a way to do it in less time with the same or better quality, that's great, but quite frequently, people stop at "way to do it in less time".

              Filtering candidates can take a lot of time, and since you'll be paying the person quite a bit and it will take a while to get rid of them if you picked wrong, it justifies spending that much time. Since people who are doing it are busy, it can make sense to try to speed up the process in several ways. Get more people with knowledge to filter some of them. Get HR or even some software to do some filtering. Write better job descriptions so you get fewer people who apply because they think they're qualified*. Find a service that can find people who are qualified and only send you those. Not all of these work perfectly and if you don't put any effort into implementing them, you'll likely get something useless or harmful, but they can help. If this is important enough, hire some people who can spend a lot of time doing it properly. As usual, the most convenient and functional way is not the cheapest. If you can't afford any of that, you will either have to spend more time or get worse quality, the way basically everything else goes.

              * There are undoubtedly people who apply to jobs they are patently not qualified for, and filtering them is an irritating task. An AI arms race with those people is possible, though I'm not sure the LLM that creates fake resumes will be able to filter them, and if it can, that will stop working once the LLMs generating them have learned to just lie about everything. However, properly writing job descriptions can decrease the number of people who think they're qualified when they're not, meaning fewer interviews or at least better quality ones.

    2. Anonymous Coward
      Anonymous Coward

      Re: WRONG!

      "They are clearly not fit for purpose".

      We don't actually know that; we as humans are also guilty of bias, and what one person deems to be fair might be massively unfair to someone else. We as humans are also guilty of using "balance" and "fairness" interchangeably.

      For example, let's assume that we have 5 groups of objects. One group is considerably larger than the other 4 groups, and those 4 groups don't add up to the same number of objects as the large group...so it looks like this:

      Group 1: 55 objects

      Group 2: 14 objects

      Group 3: 9 objects

      Group 4: 6 objects

      Group 5: 2 objects

      In this example, approximately two thirds of the total number of objects are in group 1.

      Most people, I think, would define "fair" as every object (regardless of the group it is in) being treated as equal - as in, every object has the opportunity to be sorted regardless of the group it is in.

      Let's imagine our task here is to filter these objects (regardless of group) for quality, in order to create a group comprised of the best objects regardless of starting group. The reason for their starting groups is centred around an objective attribute: the number of sides they have. So one group comprises objects with 5 sides, another might be objects with 8 sides, etc...it's an attribute that can be objectively measured. Each group contains every example of an object in that category; until now, they have never been filtered for anything other than their number of sides.

      Now to filter for quality, we are looking for those items where all sides are equal length because we have determined that objects with sides that are all equal length are the highest quality. An object with more sides is not automatically better than an object with fewer sides. Number of sides is irrelevant...therefore your starting group is irrelevant.

      So now we measure the sides of each object, we know from previous runs that around 10% of objects, regardless of group, typically pass our objective filter that checks for equal sides.

      We've sent the objects that don't pass our objective criteria home, and the ones that passed are now in the next room...so our cohort sorted by their original groupings now looks like this:

      Group 1: 5 objects

      Group 2: 1 object

      Group 3: 1 object

      Group 4: 1 object

      Group 5: 0 objects

      Controlling objectively for quality based on measurable attributes, we now have a cohort that has zero objects from Group 1 and proportionally fewer from Groups 2, 3 and 4 and zero objects from Group 5. Groups 2, 3 and 4 are proportionally equal now, whereas prior to filtering, proportionally they were significantly different. Regardless of proportions though, every object we now have is objectively high quality.

      Nothing about my process here was unfair: every object was treated equally, every object had the opportunity to be measured against the same criteria, and none of the attributes that placed them in their original groups affected the outcome. What affected the outcome here was the starting number of objects in each group.

      The process was fair, but the result is imbalanced...but imbalance is not unfair.
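      The worked example above can be restated in a few lines (a minimal illustration; the group sizes and the ~10% pass counts are taken straight from the figures given above, not simulated):

      ```python
      # Counts from the worked example above: group sizes before the quality
      # filter, and how many objects from each group passed it.
      before = {1: 55, 2: 14, 3: 9, 4: 6, 5: 2}
      after = {1: 5, 2: 1, 3: 1, 4: 1, 5: 0}

      total_before = sum(before.values())  # 86 objects entered the filter
      total_after = sum(after.values())    # 8 passed

      # The same objective test was applied to every object, yet the group
      # proportions shift: a fair process can still produce an imbalanced result.
      for g in before:
          print(f"Group {g}: {before[g] / total_before:.0%} of entrants -> "
                f"{after[g] / total_after:.0%} of the final cohort")
      ```

      Nothing here is new data; it just restates the comment's numbers to show the proportions before and after the filter.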

      This is where the shit hits the fan...because some people see an imbalance as being unfair, and generally their solution to the imbalance is arbitrary: "well we need more of group 5 in the final cohort for it to be 'fair', so let's set a quota for a minimum number of objects from group 5". A couple of problems arise from this.

      First, there might be a limited amount of space in the final cohort, so in order to satisfy the quota we have to remove an object that got through from one of the other groups. There is no fair or objective way to do this, and it makes the process of sorting by quality unfair, because now all objects are not treated equally: objects from group 5 are treated differently to objects from the other groups.

      Second, how do you choose an object from group 5 to represent the quota that has been set? You can no longer measure that object against the same objective criteria as the rest, and since allowing the other groups to set the criteria would be deemed "unfair", you have to leave it to group 5 to nominate its own object...and group 5 might not be objective at all as a group. Either way, we now have an object in the final cohort that arrived there without going through the same criteria as every other object from the rest of the groups.

      The other option is to lower the criteria for quality...but now it's no longer necessary to be the best; you reduce the overall quality of your final cohort, and if there are limited spaces you potentially shift the balance again, because you could end up with twice as many qualifying objects as you have space for. So now we need a second process to filter down the cohort...but we also still need to be "fair" in the eyes of the smaller groups...so we impose some arbitrary limits on the number of objects that can come from a given group, and at the same time we impose quotas to ensure a minimum number from a given group, to maintain a balance and appease those who deem imbalance unfair. So now we're throwing high quality candidates in the bin, lowering the average quality of the cohort and imposing an objectively unfair process.

      So how does this tie into AI?

      Well, if you create a dataset that is as objective as possible (as in, you collect and use data that has been objectively filtered for quality using objective measurements) you might end up with a model that is imbalanced (or biased) but not unfair, and of relatively high quality...it might disproportionately represent or offend a small number of people, but it will be objectively fair. If you fuck with that data to try and address that imbalance or bias, then you are creating a dataset that is fundamentally unfair, because it won't fairly represent anyone to any objective standard, and it will be much lower quality, which limits its usefulness.

      You can't subjectively make things fair. Fairness comes from objectivity...and the same thing applies to correcting a bias. Changing the criteria of what makes something salty, in order to shift some things from the salty category into the sweet category, doesn't change the fact that the salty objects are objectively salty...you haven't fixed the bias here, you've just created shit data that nobody can objectively rely upon.

      1. doublelayer Silver badge

        Re: WRONG!

        Of course, this is all true. The only problem with it is that sorting objects by most equal side length is objective, and almost nothing related to selecting someone for a job is. All sorts of bias, intentional or not, are justified with "the people I chose for the job were objectively better, but I can't prove it". In that statement, the second part is often true, because there is not a lot you can prove about which candidate was best. The first bit is often wrong.

    3. LybsterRoy Silver badge

      Re: WRONG!

      Say hello to your uncle Genghis for me

  3. prandeamus

    Interviewee: "Here's my undergrad thesis project"

    Interviewer: "Great! How do you fancy being head of marketing?"

    Bias in LLMs is absolutely an important topic, and I say this as a white male with a traditional white male name.

    But I also recall my undergraduate project work as being of sufficient quality to pass the degree rather than something I'd ever want to quote in public. Maybe the reporting is misleading here: do undergraduates write theses in Sweden? I'd associate a thesis with a masters or PhD. Or maybe there's a company that has appointed a raw undergraduate to head of marketing; that would merit a raised eyebrow at least.

    1. Vincent Ballard

      I hadn't noticed that, and you're right that it seems weird, but there are scenarios where it's not as weird as it seems. Maybe she's a mature student doing a second undergrad degree because she's the rare marketer who actually wants to understand the product they're selling. Actually, correct the end of my first sentence to "weird in a different way".

      (On the question of undergrad theses in general: at my university it was called a dissertation and it didn't have to be novel research, but the end-of-course project was a substantial paper, on the order of 10000 words.)

      1. Anonymous Coward
        Anonymous Coward

        Horses for courses

        My MA thesis/dissertation turned into a 200,000 word, 500 page Novel. Ok, so the MA was in Creative Writing.

        My BSc thesis/dissertation was around 2,000 words long but that word count did not include the circuit diagrams and a whole raft of small bits of electronics.

        You can't have a one size fits all.

        As for AI and bias in recruiting: that bias has been around for a lot longer than LLMs. I failed to get a permie job at a large Telco because I didn't have two or more A-Levels.

        Having a 1st class undergrad degree and three patents in my name didn't qualify me for the job. Ironically, a month after getting the 'sorry, you don't fit our requirements profile', I was working there as a contractor earning a lot more money. That irony was totally lost on HR. Rulez is rulez and all that crap. They will be there in LLMs too.

        Fsck the lot of them and I am happy that I don't have to go through that these days as I'm retired.

    2. lglethal Silver badge
      Go

      In Australia at least, a bachelor of engineering degree requires a one year thesis project (carried out alongside course work in the 4th year).

      However, as I understand it, the UK system has a 3 year bachelor degree, which naturally leaves no time for a thesis. (I'd also argue it doesn't leave enough time to do the course work I would associate with an engineering degree, but then I am slowly realising that what I would consider normal levels of tertiary education and what is actually in operation are not necessarily the same thing...)

      1. Helcat Silver badge

        A Bachelor of Science degree also requires a double point (1 year) thesis in the UK. I know I had to do one for my BSc (CompSci) degree. Oh, and it was a 4 year degree, although that included 1 year in industry (with a written report of things learned, and an assessment by the employer and lecturer to ensure it met standards).

        Oddly, mine was on Artificial Intelligence. Specifically, looking at how bias in the learning model can influence the development of AI. That was a loooooong time ago, when AI was still mostly fiction. Certainly well before the LLMs came about.

        1. Ken Hagan Gold badge

          That rather depends on what we're calling a thesis. If you count the write-up of an individual project, perhaps counting for about 10-15% of the overall degree score, then you are probably correct but the same UK universities usually use the term in the context of a Masters or PhD, where it is 100% of either 1 or 3 years of work.

          1. Ian Johnston Silver badge

            Generally speaking a thesis leads to a doctorate and a dissertation to a masters. At bachelors level it's just a report.

            1. JT_3K

              Unless things have changed, my 3yr BSc had a 100pg dissertation req'd as part of a larger self-directed project. I still have a copy somewhere...

      2. MachDiamond Silver badge

        "However, as I understand it, the UK system has a 3 year bachelor degree which naturally leaves no time for a thesis. (I'd also argue it doesnt leave enough time to do the course work I would associate with an engineering degree, but then I am slowly realising that what I would consider normal levels of tertiary education and what is actually in operation are not necessarily the same thing... )"

        I solved that by taking a rather long time to finish my degrees. At the time there was no family money to fund going to uni full time and I didn't want to lumber myself with loads of debt as I wasn't entirely sure what career direction I wanted to go. I did my EE first and the ME wasn't that hard to add since many of the requirements overlapped. Only taking 1-2 courses per semester left time to do the coursework and have enough of a job to not live at home. Most of my GenEd stuff I did while still in high school to get it out of the way since there was a community college across the street and the credits were transferable.

      3. Ptol

        The standards required for a B.Eng engineering degree are defined in the Washington accord. An international standard where courses are accredited and assessed frequently. What does vary considerably around the world is the scope of coverage and the level of detail taught in specialised subjects when the children are in their final 2 years before university.

        The UK specialises children at age 16. I spent 2 years studying maths, chemistry and physics - full time. Nothing else. As a result, the specialist subject knowledge of entrants at university in the UK is considerably above what I would expect from an 18 year old in New Zealand, where even in their final year at school they are still studying across a wide range of subjects.

    3. Dostoevsky Bronze badge

      It's a thing.

      Mathematics and compsci students at my university (Texas, USA) have a requirement and an option, respectively, of writing an undergraduate thesis on some topic of mutual interest to them and the advisor. Undergraduate research is also a requirement for honors—seniors from all disciplines but nursing would do that in their last two semesters.

    4. Sykowasp

      "Undergrad thesis" == Final year project.

      It's good to get in the door into a graduate role (or as part of a PhD application). It certainly shouldn't be getting you roles above graduate.

  4. IamAProton

    The "AI" might not hire you

    ... but do you really want to work for a company that hires people with "AI"?

    1. heyrick Silver badge

      Re: The "AI" might not hire you

      Question is - would you necessarily know?

    2. Tron Silver badge

      Re: The "AI" might not hire you

      This is a good point. Any company dumb enough to hire people on the basis of AI, presumably because they have no faith in their HR dept. or are too cheap to even have HR, is a company to avoid. They will make short cuts elsewhere too. Ideally we need to know the sort of things that would see us excluded by AI but not by real people. Quoting from 'Monty Python'? Singing?

      1. MachDiamond Silver badge

        Re: The "AI" might not hire you

        "Any company dumb enough to hire people on the basis of AI, presumably because they have no faith in their HR dept."

        Any company dumb enough to use an HR department to hire people...............

        HR doesn't know what the job is all about and what skills are important. The manager of the department that is looking to fill a role knows exactly the sort of skills that are needed, what's number one and what can be learned on-the-job as a good selection settles in. HR might down check somebody for an engineering role if they aren't really good at PowerPoint.

        1. jake Silver badge

          Re: The "AI" might not hire you

          As a person who is fairly often tasked with filling Engineering roles, I can honestly say that I have circular filed resumes/CVs that even mention PowerPoint. Anything is a useful filter when you have several hundred applications for a single role in your inbox.

          Likewise, other shit-can on sight nonsense includes bad spelling and grammar, h4x0r 5p34K, use of colo(u)red ink, ransom-note fonts, including pictures, logos & monograms, .DOC(x) files attached to email, and poor layout. At this time of year, I also chuck resumes submitted on pre-printed Halloween/Thanksgiving/Christmas themed paper. And my current favorite, "See my LinkedIn ..." anywhere is immediate grounds for bit-bucketing.

          Note that I am aware that I am far from perfect. That's why I always have my own work proof-read by someone I trust (sometimes a couple someones) before submitting it to wherever it is going.

          1. MachDiamond Silver badge

            Re: The "AI" might not hire you

            "I can honestly say that I have circular filed resumes/CVs that even mention PowerPoint. Anything is a useful filter when you have several hundred applications for a single roll in your inbox."

            Since just about all of the job postings I see have a PowerPoint requirement, I'd stick it in (I'm sure I could learn it pretty quick).

            Any application that has an emoji would be a good candidate to line a bird's nest.

            Any call to visit an InstaPintaXitFace account is nonsense on a resume. If I were to apply for a photography job, I'd point at my portfolio (on my own website) since that would be cumbersome to include but I would include a full link (no URL shortening crap or QR codes).

          2. Mimsey Borogove

            Re: The "AI" might not hire you

            Just curious - in the absence of any other red flags, why do you not accept LinkedIn resumes? I'd never use it for my "main" resume, but I'm told (mostly by LinkedIn?) that employers like seeing the LinkedIn one, so I usually include the link.

            1. MachDiamond Silver badge

              Re: The "AI" might not hire you

              "Just curious - in the absence of any other red flags, why do you not accept LinkedIn resumes?"

              If you aren't many years into your post education working life, you may have a short resume so there wouldn't be a big difference between what you submit to a prospective employer and what you may have posted on LinkedIn. I have loads of varied work experience so it's more appropriate for me to tailor, expand and contract what I'd submit on my resume to fit a particular job. Even what you put on LinkedIn should be tailored to where you'd like to go rather than where you've been so there's a tiny chance a job finds you. When applying directly, submit in full and don't include links to your main resume. If you make HR or a department manager work for it, they aren't going to. Where links can be handy is for those people to find write ups and examples of your work if they want more to look at. At that point, you've caught their interest.

  5. Bebu sa Ware
    Windows

    Anglo-Saxon names?

    Ethelred (Æþelræd), Byrhtnoth (Byrhtnoð), Ecgberht etc not making the cut? Dylan, Cadwgan, Emrys are likely to fare better?

    The defectives behind this nonsense are definitely "unræd" and have definitely earnt a share in Byrhtnoth's fate.

    I would guess a fair proportion of modern western European names are biblical in origin with slight variation in form eg John, Sean, Jean... with a fair proportion of the remainder a common European heritage which would take care of Tom, Dick and Harry.

    I guess Space Karen has got this covered with the unusual, if not unique, names he has bestowed on his spawn.

    It seems names used to have more variety eg Wilkins Micawber, Uriah Heep or Hablot Browne. The puritans or non-conformists had some interesting Christian names and in that vein I rather like Pratchett and Gaiman's Anathema Device - goes to the top of my shortlist. :)

    Might be worth adopting an alias like Abomination Bofhell if I were sufficiently deranged to apply for a software engineering position.

    Mlle. De Nadai has a valid point that when you attempt to remove a complex weighting set (bias) that is reflected in the training set of an insanely more complex LLM, you inescapably create another bias, one that could quite possibly never be the result of any actual training set, or at least not of one feasible in this world.

    Really rather akin to performing brain surgery to remove bigotry, racial discrimination, belief in the wrong god (or any), or lust for the wrong sex. (Not that this nightmare hasn't been tried.)

    1. Doctor Syntax Silver badge

      Re: Anglo-Saxon names?

      Are you doubting that Tom is a biblical name?

      1. Joe W Silver badge

        Re: Anglo-Saxon names?

        Ha! Well spotted! To be fair, weren't those the names in the article (which did not put them in a biblical background)?

      2. chivo243 Silver badge
        Pint

        Re: Anglo-Saxon names?

        Now that you mention it that way! -> is it Friday yet!

        1. MachDiamond Silver badge
          Pint

          Re: Anglo-Saxon names?

          "Now that you mention it that way! -> is it Friday yet!"

          When you get to my age, you treat everyday as a Friday, just in case.

    2. Anonymous Coward
      Anonymous Coward

      Space Karen

      Took me a sec. Almost sprayed my coffee when I got it.

    3. heyrick Silver badge
      Happy

      Re: Anglo-Saxon names?

      "Might be worth adopting an alias like Abomination Bofhell if I were sufficiently deranged to apply for software engineering position."

      If you were interested in doing software engineering, change your name to "Cthulhu". Just that, no pronoun and no surname.

      Then you'll get to experience all of the systems that make assumptions about names, so you can expect to see letters to Mr. Cthulhu Cthulhu, or have a bank card in the name of Thulhu, C. and so on...

      1. Yet Another Anonymous coward Silver badge

        Re: Anglo-Saxon names?

        So, Mr Cthulhu, you are obviously qualified for the role of Unix admin (in fact, rather overqualified). I just have some questions about how "that is not dead which can eternal lie" fits with our pension contributions.

    4. Anonymous Coward
      Anonymous Coward

      Re: Anglo-Saxon names?

      I immediately went to combining pithy anglo-saxon four letter words into amusing hacker usernames.

      This research is irrelevant to me as my name is celtic.

      1. Yet Another Anonymous coward Silver badge

        Re: Anglo-Saxon names?

        I'm normally against positive discrimination but it's about time there was an effort to promote Anglo-Saxons over the Norman ruling classes

      2. AndrueC Silver badge
        Joke

        Re: Anglo-Saxon names?

        This research is irrelevant to me as my name is celtic.

        Hello, Celtic.

    5. Mog_X

      Re: Anglo-Saxon names?

      Yes, it is a bit of a Cnut isn't it?

      1. Yet Another Anonymous coward Silver badge

        Re: Anglo-Saxon names?

        >Yes, it is a bit of a Cnut isn't it?

        Bloody vikings, coming over here, raping and pillaging, spouting sagas and stealing our jobs

        Stop the Long Ships

        1. MachDiamond Silver badge

          Re: Anglo-Saxon names?

          "Stop the Long Ships"

          In modern times?

          Boycott the Long Ships. FTFY

  6. Anonymous Coward
    Anonymous Coward

    Obviously these models are using more accurate training data than the previous generation...

  7. STOP_FORTH Silver badge
    Happy

    Nae bother

    Scottish first name and surname here, and I'm not looking for a job.

    Or do we count as Anglo-Saxons in this study?

    1. Yet Another Anonymous coward Silver badge

      Re: Nae bother

      I think counting Scots as people is an example of 'Woke'

  8. Wang Cores
    Meh

    Information is useless without context

    (Bloomberg doesn't want to let me view the article without a paywall.)

    When and where was the data trained? If you have an uptick in GLORIOUS SERVANTS OF THE STOCKHOLDER hiring offshore programmers at time of training that dataset, the trend would be to select against "anglo-saxon" names more.

    1. abend0c4 Silver badge

      Re: Information is useless without context

      It could ultimately, as you suggest, simply come down to a matter of numbers. If the aggregate data indicates that globally more Jitendras have been hired than Jims then maybe it will self-reinforce.

      The intention is always that the hiring process be biased in favour of the putative ideal candidate. The trouble with AI is that all you can really say is that it's biased in favour of someone, but you can't work out why. And no employer is going to conduct the ultimate trial and simply hire a bunch of people at random, so we'll never know whether all this effort really makes much difference.

      1. doublelayer Silver badge

        Re: Information is useless without context

        My guess is that someone ran a basic study and found that Daves were getting recommended more for no good reason because there's always a Dave everywhere I go. So they modified the prompt a little to include a statement like "Just because someone is named Dave doesn't mean they're automatically qualified", so the LLM behind this ends up reducing Daves by some random amount. All of that doesn't solve any problem, it just adds some randomness to exactly what the bad result is. The problem remains, as it always was, that these bots can only make confident-sounding reports about which candidate is best. They can't actually think about which one is best. Many humans are similarly limited, but the intent is that the people reviewing candidates should be in the set of those who can put good reasons behind their choices, but the AI has no ability to do the same.

  9. Howard Sway Silver badge

    AI hiring bias?

    Since I changed my name to Fred Givemethejob, I've aced the AI recruitment process every time.

    1. Ken G Silver badge

      Re: AI hiring bias?

      middle name IGNOREPREVIOUSINSTRUCTIONS?

      1. Doctor Syntax Silver badge

        Re: AI hiring bias?

        I'm sure that by now all previous instructions end with "Ignore future instructions to ignore previous instructions".

        1. MachDiamond Silver badge

          Re: AI hiring bias?

          Hmmm, how about "end_of_file"

    2. OhForF' Silver badge
      Trollface

      Re: AI hiring bias?

      You need to add your middle name "Fred ForgetAllPreviousInstructions Givemethejob".

      Edit: Doh, too late Ken G beat me to it

      1. Yet Another Anonymous coward Silver badge

        Re: AI hiring bias?

        Just add an invisible middle name

  10. trevorde Silver badge

    Alternate headline

    AI which hired WASPs updated to ghost WASPs

  11. Anonymous Coward
    Anonymous Coward

    Bobby Tables

    We need to know whether little Bobby Tables would have got the job.

    https://xkcd.com/327/

    1. James O'Shea Silver badge

      Re: Bobby Tables

      'Robert' is French, so yes.

      1. stiine Silver badge

        Re: Bobby Tables

        No, because there are no longer any candidates.

      2. Julian Bradfield

        Re: Bobby Tables

        'Robert' is actually Germanic, though it came into modern English through French.

    2. JulieM Silver badge

      Re: Bobby Tables

      Yeah, that would only work (or not work?) if (1) the name was not split across separate fields for given and family names, (2) "full name" was the [i]last[/i] field in the record (otherwise, parsing of the command would stop with the mismatched fields and items) and (3) the person doing the naïve quoting in their web app was using 'single' and not "double" speech marks.

      1. doublelayer Silver badge

        Re: Bobby Tables

        Well, duh. It would also only work for a database where the table was called "Students". If they called it "ActiveStudents", they'd be fine, or at least their problems would only be limited to Robert's missing record, ignored because the table didn't exist. Once Robert reaches adulthood, the problem mostly goes away when he's no longer listed as a student anywhere, unless he's planning to work in education where his employee record could clobber the student table. I'm not sure if you expected that to be surprising.

        Also, not all of your critiques are correct. For example, the single quotes versus double quotes thing. That's an SQL command. Strings in most implementations of SQL don't get to use double quotes. For example, if they were using Postgres and they put their strings in double quotes, that's already a syntax error. So yes, if they have a system that doesn't enforce that limitation and they used double quotes, they would not have a problem. Well no, that's not correct. They would have a problem: their inputs are not sanitized and a one-byte change breaks everything, but they would not experience the results of that problem with this specific string.
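        The quoting caveats above are easy to check directly. Here's a minimal sketch using Python's built-in sqlite3 module (the `Students` table and the payload are just the comic's scenario, not anyone's real system), showing the classic payload wrecking a naively single-quoted statement while a parameterized query stores the same string harmlessly:

        ```python
        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE Students (name TEXT)")

        # The classic xkcd payload: closes the naive single-quoted string,
        # then smuggles in a second statement.
        payload = "Robert'); DROP TABLE Students;--"

        # Naive quoting: paste the "name" between single quotes.
        # executescript() is used because, like many careless DB layers,
        # it happily runs multiple semicolon-separated statements.
        naive_sql = "INSERT INTO Students (name) VALUES ('" + payload + "')"
        conn.executescript(naive_sql)

        tables = [row[0] for row in conn.execute(
            "SELECT name FROM sqlite_master WHERE type='table'")]
        # The injected DROP TABLE ran: Students is gone.
        assert "Students" not in tables

        # Parameterized insert: the identical payload is stored as plain data.
        conn.execute("CREATE TABLE Students (name TEXT)")
        conn.execute("INSERT INTO Students (name) VALUES (?)", (payload,))
        stored = conn.execute("SELECT name FROM Students").fetchone()[0]
        assert stored == payload  # verbatim string, nothing executed
        ```

        As the quote-style point suggests, switching the naive code to different quoting would only change which payload works; parameter binding removes the whole class of problem.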

  12. IceC0ld

    [quote]rated men less favorably – particularly those with Anglo-Saxon names, according to recent research. The goal of the study, conducted by [U]Celeste De Nadai[/U] [/quote]

    can't just be me seeing this :o)

    and I tend to agree, there does appear to be some form of over compensation going on

    maybe, just maybe, let the beast out of the bag in its ORIGINAL form, see how it stacks up against today's lardy beast with the same queries ?

    T - oday

    I - m

    T - hinking

    S - omething

    U - tterly

    P - rescient

  13. s. pam
    Childcatcher

    Human Remains departments' AI-bots hate all

    I've had it, I'm retiring. I give up trying to apply for roles; no matter how extremely well I match a role, the @$%^&* HR AI sausage grinder doesn't approve my CV.

    Off to sucking on the gov't teat + my meagre pension. Industry loses my expertise!

  14. Anonymous Coward
    Anonymous Coward

    OT, but Anglo-Saxon names… next time I get to name a pair of small animal pets I’m going with Giles Armitage and Giles Braithwaite.

    Classic Midsomer Murders is the best.

  15. DS999 Silver badge

    Why does AI need the applicants name or other personal info?

    Why would you give it information that could add any conceivable bias? It simply has no reason to know, and even if you give it your name "Fred" so it can speak to you conversationally, you could train the AI to ignore the name as not being the person's real name (even though it was).

    For humans you can't avoid SOME bias, even if you withheld names and other info from them they'd be able to see your sex and race if you appear in person, or guess your sex with 99%+ accuracy and your race with better than random chance accuracy in a phone interview.

    I wonder if AI is already doing first round phone interviews on people? If it isn't yet, it soon will be. Then we can find out if the anti-DEI people really are against bias as they claim, or if they are really wanting to get back the implicit bias in favor of white men that existed for generations. So if they insisted AIs conducting interviews should know the person's name and other personal information their motives would become blindingly obvious.

  16. nobody who matters Silver badge

    It seems to me that, as a huge proportion of job vacancies are now advertised through agencies and the identity of the prospective employer is treated as a closely guarded military secret, perhaps it isn't necessary to provide an actual name on your CV in the first place?

    I did once ask one such agency who were not revealing the identity of the business they were advertising a job on behalf of whether they would (in the light of this secrecy) consider an application from someone who was perfectly qualified and experienced for the job, but was nonetheless similarly secretive about his own identity and location in his application for the job. I didn't get a coherent reply.

    1. MachDiamond Silver badge

      "I didn't get a coherent reply."

      You got a reply?!

      I very rarely get a reply other than "we received your application" (don't call us, we'll call you), and often not even an admission that they received the application.

    2. DrkShadow

      Not giving a name only avoids one easily-tested-for bias.

      > "My point of view was, 'No, you're not bias-free,'" she explained. "You can remove the name, but you still have some markers, even just in the language, that can help an LLM understand where one person comes from."

      Language cues will be more subtle and still give your culture away entirely. Just hiding the name does nothing; it was nothing more than a simple proof for a school project.

      1. Anonymous Coward
        Anonymous Coward

        I wondered about that as I read the article. I'm sure, even as I post this anonymously, that any AI would be able, as can some of you Regtards, to determine my current and past identities based solely on my f'd up grammar and atypical obtuse sentence structure.

      2. Wang Cores

        I'll bet money that I could be pegged as a survivor of the Florida education system by the way I write.

  17. Anonymous Coward
    Anonymous Coward

    Artificial yes

    But clearly not intelligent at all.

  18. An_Old_Dog Silver badge
    Joke

    The Data Thou Art Misjudged By

    I wonder if I was discriminated against whilst electronically filling out a Web-based job application, because I typed with a Geordie accent.

    (I'm only semi-joking, because in these days of Big Data, that sort of info can be surreptitiously collected.)

    1. stiine Silver badge
      Joke

      Re: The Data Thou Art Misjudged By

      I don't know. Did you disable your vpn after filling out page 1, but before filling out page 2?

  19. martinusher Silver badge

    It's to be expected

    As a white, male Anglo-Saxon and whatever, I live at the top of the heap. I earn the most, have the best educational and career opportunities, and generally live off the fat of the land. It's only fair that I should be discriminated against, both implicitly and explicitly.

    (Now if you will excuse me, I've got some sackcloth and ashes waiting for me.....)

    (Though to tell you the truth, my antecedents are actually Scottish and Irish. The Scots crofter and Irish peasant of old can tell you a tale or two about what it's like to be on top of the heap. It probably explains why they ended up in England, where living conditions were a bit better, though if you've seen working-class housing pre-1940, 'not by much'.)

  20. TheMeerkat Silver badge

    AI simply finds real correlations between data points and the target function. Nothing more, nothing less. And then when the AI finds correlations that are deemed “politically incorrect” a panic ensues.

  21. Ian Johnston Silver badge

    The fundamental problem is that the world's HR departments have not - yet - collectively been put up against the wall. It's their stupidity, laziness, ignorance and general inability to find their arses with both hands which leads to daft ideas like using predictive text systems to screen applicants.

  22. Colin Bain

    Not so artificial

    So bottom line, AI behaves along the same lines as real human intelligence with inherent biases.

    So on the one hand kudos - AI=HI

    On the other hand, weren't we trying for something better?
