Mamas, don't let your babies grow up to be coders, Jensen Huang warns

Nvidia CEO Jensen Huang believes AI has advanced to the point at which it's no longer necessary to prioritize computer science and coding education for the world's youth. Apologies to the past decade of CompSci grads, but your college years would have been better spent gaining expertise in areas like science, manufacturing or …

  1. mhoneywell

    Hmmmm

    I much prefer a world where people can look under the covers and understand what's going on, rather than leaving it to AI. That's what a healthy relationship with AI would look like.

  2. ChoHag Silver badge

    Computer programmers will be replaced by software? Where did I hear that before...

    I'll be keeping my text editor, thanks.

    1. Joe W Silver badge

      Computer Science is not (only) programming. And judging from ML-generated output so far, I don't think we should just trust the system to give us correct results. I would say that journalists (who hopefully do check sources!), developers (and that includes much more than just writing code, ffs) and scientists are pretty safe so far.

      Learning to think has never been a bad thing. I don't write (a lot of) code any more, but it certainly gives you an insight into how to break problems into smaller pieces while not losing sight of the bigger picture. Same with hand-tool skills (woodworking is most accessible) and needlework. Learn a bit of both. It is not only fun, but challenges your brain in a wonderful way. My kids can sew and saw in a straight line.

    2. DS999 Silver badge

      That's not what he said

      He said CODERS would be replaced by software.

      That I can believe (eventually, some future generation of AI well beyond the current one) because that's basically rote translation of a concept into code. That's easy, a lot more people can do that than the truly difficult part of programming. That is, gathering requirements and creating a spec for what the program is supposed to do. That's the really hard part, and AI won't help you there - but he wasn't saying it would.

      But assuming some humans have done that and produced a detailed spec saying exactly what the program should do and how it should react to all the various inputs and circumstances, then yes, I could easily see some type of automation taking over and translating that into whatever computer language you prefer.

      It won't do what you want, because you'll have got the spec wrong, but the one improvement of this method of programming over the current one is that instead of making code changes to fix the shortcomings/errors in the spec (i.e. to do what the end user WANTS rather than what he told you, or what he thought he wanted until he saw it in action and realized that wasn't what he wanted after all), after fixing the spec you can have the automation generate new code from scratch. Basically the spec (which might be part English, part flowcharts, part mathematical formulas, part government laws and regulations, part APIs from third parties, and who knows what else, all combined into a toxic witches' brew) becomes the source code.

      I've long thought one of the reasons there are so many computer languages is that academicians think programming is writing code, so they think a language that makes that code more "elegant" will be better. They don't understand what real world programming is.

      1. Joe W Silver badge

        Re: That's not what he said

        Ah, yes.

        You are correct.

        Just soooo many people confuse the two. I assumed that ... id-ten-T was one of them (still 90% certain)

        The academics are not programmers. The academics are more theoretical and do get the algorithm part of computing science. They are not at fault for the bajillion different languages. And, to me, all languages in a certain group are same-ish. C, FORTRAN, R, Basic and Pascal are all procedural. Then there's object oriented, and finally the weird Lisp stuff. Most of what divides languages within a group is syntax and some details (like row major / column major, pass by value / pass by reference - which is important, of course, but not enough to make them oooh so different).

        1. Bebu Silver badge
          Windows

          Re: That's not what he said (Jensen's device ;)

          《pass by value / pass by reference,》

          And of course pass (or call) by name [Algol 60]. (And Jensen's device. :)

          I imagine lambda expressions serve the same purpose in the fashionable languages today. :)

          When you think about it, why would AI coders need high-level languages? They could just dive into the instruction set architecture and produce binary directly. :) They could fiddle with the microcode in processor families that allow that (Alpha PALcode?). I assume global optimization would then be a doddle.

          Anyway I think he is likely right but probably not for the reasons he thinks.

          I.T. was the "steam" of the age of railway mania or the "electricity" of the mid-20th century. During those two periods steam and then electricity were used for all sorts of ludicrous purposes - some just plain idiotic, some stupidly dangerous - which have passed into well-earned obscurity. I.T. is currently used for similarly ludicrous purposes, which will also vanish.

          The next hundred or so years are going to pose such challenges that much of the worse than useless contemporary bullshit will be discarded very rapidly.

          Unsurprising that the manufacturer of the hardware which AI/LLMs need truckloads of to run on would be boosting AI. [MRDA]

      2. CrackedNoggin Bronze badge

        Re: That's not what he said

        but the one improvement of this method of programming over the current one is that instead of making code changes to fix the shortcomings/errors in the spec (end user spec), after fixing the spec you can have the automation generate new code

        If that were true the end user wouldn't need you as a middleman. That's what CEOs dream of, and promise their activist investors in exchange for larger bonuses. From that POV, AI is the new outsourcing, a good excuse for getting rid of or demoting anyone who isn't in business and sales. How's that going, Boeing?

        Basically the spec (which might be part English, part flowcharts, part mathematical formulas, part government laws and regulations, part APIs from third parties, and who knows what else all combined into a toxic witches brew) becomes the source code.

        AI is nowhere near that. Where it is now is another iterative improvement: hard wiring replaced by punch cards, replaced by microcode, replaced by compiled languages, added to by interpreted languages, enhanced by smart editors. AI is another programmer's tool. Its limitations are just as apparent as its benefits. For those limitations to change by another few orders of magnitude is going to have to wait for the next hardware revolution, because of resource barriers.

        ... eventually, some future generation of AI well beyond the current one ... Maybe - meteor strikes, etc. permitting. But fuzzifying what is with what might be is not engineering; it's entertainment or dreaming (fine), or a pseudo-religion (ugh), or, in the case of Jensen Huang and some others, good sales talk (expected).

        1. DS999 Silver badge

          Re: That's not what he said

          If that were true the end user wouldn't need you as a middleman

          The end users can't write a spec - they don't know what they want. Someone has to tease out of them what they actually want/need versus what they think they want. Someone has to ask all the right questions. If you asked the end users to write a spec today for replacing an existing/legacy system (which is what most programmers do these days) they would say "I want it to work just like the current system does". Some of them might throw in a few things here and there "I wish the screen that let you choose shipping addresses remembered which addresses had been used with which people so that the options I'd actually use would be at the top instead of alphabetical order" but mostly they'd just want things to work exactly how they have always worked. How is an automated system going to know "what the current system does"? That's where you start asking a bunch of questions, watching how people work to see what steps they follow and you build flowcharts and so on.

          Writing something new is even more impossible to automate, because the potential userbase has no reference point. That's where you usually get the worst designs, because programmers (who have no idea how the people who will be using the system do their job or what would be most efficient) design something that makes sense to them. Then they get all frustrated and talk about "stupid users" because the end users can't understand the new app's workflow that seemed "obvious" to their on-the-spectrum brains.

          No, automation will never replace that part of programming until we get a true AGI.

        2. 0laf Silver badge

          Re: That's not what he said

          "AI is the new outsourcing"

          I shall use that.

      3. TH*

        Re: That's not what he said

        OK, so you change the spec to fix your bug and regenerate the code from scratch. And now you can throw away your carefully collected and filtered bug list, because those bugs are gone and a bunch of new ones have popped up!

        Fix the next bug in the spec, regenerate the code again and.. Oops, same thing happened! Bin the list, start QA again, repeat. Possibly forever.

        Software always starts wrong, and (sometimes) eventually becomes good through a process of incremental improvement. I don't think regenerating code in a non-deterministic way from a spec document can really be a good substitute for that process.

        The other thing I notice in your example is that the spec now has to be massively more detailed and precise than a spec has ever been before - probably so dense and complex that a non-technical person cannot understand it. Now it's.. the code!

        1. DS999 Silver badge

          Re: That's not what he said

          I don't think regenerating code in a non-deterministic way from a spec

          Who said the process should be non-deterministic? The one requirement I would make of that "AI" is that it perform the same process so that it DOES result in the same output every time. That way if there's a bug "when the monthly spending is over $30,000 you need to use a different tax form" then you include that in the spec and next time that bug will be gone. You seem to be assuming that fixing that one item in the spec (and the spec isn't a "document", I thought I was pretty clear about that) will result in something else breaking just because that's sometimes how it happens with patching computer code.

          I don't think that necessarily follows at all - if it does then you weren't precise enough about your change to the spec to fix the "bug". That's why humans need to write the "spec" (or "collect it", in the case of stuff like applicable laws/regulations that have been written by others) because they alone can understand exactly what the "bug" is and what requirements need to be changed to address it. If fixing it breaks other stuff, then either that part was screwed up by a human or the person reporting the bug did not get the bug report correct - it is either only a bug in certain cases or a broader bug than believed.

          1. TH*

            Re: That's not what he said

            I don't think you'll ever get a fully deterministic "AI" as a service, especially one that remains so through updates. I suppose you could run one model per project on your own servers, never alter them in any way, don't allow any randomness as part of the system, and really hope it has no bugs of its own that need fixing.

            However as soon as you change your prompt (spec) if it's generating "from scratch" it'll create a different thing than it made last time. So I think perhaps what you want instead is to make your specs immutable, and every change is applied with a "patch" spec (i.e. a change request) that the AI can use to make a modification to the existing program. If all this is really deterministic then it would be possible to generate the code from each successive patch in turn and end up with reproducible results.

            (Also, c'mon, it's a document. Even if it's got pictures, and equations, and emoji, and a video, it's a document. We're in an HTML document right now.)

      4. katrinab Silver badge
        Megaphone

        Re: That's not what he said

        Your "detailed spec saying exactly what the program should do and how it should react to all the various inputs and circumstances" *is* your computer program.

        It sounds like you are just inventing a new programming language. It might be better than existing ones, but it is still a programming language.

        1. DS999 Silver badge

          Re: That's not what he said

          It isn't a programming language if it doesn't itself have a spec. I purposely listed a lot of unrelated stuff (English text, flowcharts, mathematical formulas, laws, etc.) to indicate that it is NOT another language. There is no fixed format for what the spec is. That's not a "programming language", any more than a random assortment of letters and numbers is a "book".

          If the "AI" truly is as capable as Huang wants us to believe, it will be capable of turning that disjointed set of information into computer code. Because that's what "coders" do after all.

          1. doublelayer Silver badge

            Re: That's not what he said

            Yes, that is a computer language, just one that has a lot of syntax options. There is a lot of that in languages already.

            English text: "if 8 in list_of_numbers then"

            Mathematical formula: Well, duh.

            Flowcharts: I don't like those languages much and they're most often used to try to teach programming, but they exist.

            Laws: You can drop one in. It will be your fault when the AI doesn't properly encode legalese into logic appropriate to your situation, since even a programmer nowadays requires a lawyer or two to do that correctly.

            I get what you're trying to say, but a program that reads a certain form of input data and produces a program is just a compiler. The syntax it understands can become more and more complex, but if it's deterministically taking an input text and producing an output, it's parsing a language. In order to be deterministic, that language will have to have certain limitations on it as well. I'm not sure the distinction is as clear as you think it is.
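As it happens, Python already accepts something very close to that "English text" example verbatim, which rather supports the point that "closer to English" is still just syntax. A minimal sketch (the list contents are invented for illustration):

```python
# The English-looking membership test is, minus the "then", literally
# valid Python syntax already - "English-like" languages sit on a spectrum.
list_of_numbers = [3, 5, 8, 13]  # invented example data

if 8 in list_of_numbers:
    print("found it")
```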

            1. katrinab Silver badge
              Coat

              Re: That's not what he said

              Mathematical formula:

              8 ∈ list_of_numbers

        2. Roland6 Silver badge

          Re: That's not what he said

          Like to see an AI code OSI Transport Class 4 from the Standard specification; human programmers struggled to implement it.

          Remember, a specification (in general) says what needs to be done, not how it will be done in a specific instance.

          Would the AI realise its code needs to interface to (another AI's) ISO OSI CLNS and utilise the information contained in the OSI Transport Class 4 Service Definition?

      5. vtcodger Silver badge

        Re: That's not what he said

        "That I can believe (eventually, some future generation of AI well beyond the current one) because that's basically rote translation of a concept into code."

        Color me skeptical on that. The problem is that in my experience expressing concepts completely and unambiguously is even harder than writing perfect code. My guess is that a lot of folks will learn -- painfully and at great cost -- that actualizing their visions will require not Artificial Intelligence, but Artificial Clairvoyance. I don't think any number of Nvidia chips is going to be able to deliver that.

      6. Ian Johnston Silver badge

        Re: That's not what he said

        that's basically rote translation of a concept into code.

        Which is what a compiler does, after all ...

        1. DS999 Silver badge

          Re: That's not what he said

          Which is what a compiler does, after all ...

          Not at all. A compiler has very explicit syntax rules for what it will accept, and allows no deviation from those rules. It does not accept a "concept" as input, a human must translate that concept into the very explicit syntax first. i.e. a compiler will not accept "sort the list of helicopter parts by vendor name and within that larger search order the list of parts for each vendor by part number" as input.

          You need to translate that concept (which is "spec") into C (or whatever) language following the very explicit rules for C - from everything such as using the correct data structures that contain that list of helicopter parts all the way down to including a ';' after each line.
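That human translation step can be made concrete. A minimal Python sketch (the data structure and field names here are invented for illustration) of what the "sort the helicopter parts" concept becomes once someone has expressed it in a language's explicit syntax:

```python
# Invented example data standing in for "the list of helicopter parts".
helicopter_parts = [
    {"vendor": "Rotorcorp", "part_number": "R-220"},
    {"vendor": "Aerofix",   "part_number": "A-101"},
    {"vendor": "Rotorcorp", "part_number": "R-015"},
]

# "Sort by vendor name and, within each vendor, by part number" has to
# become an explicit key function before any compiler/interpreter will
# accept it - the concept alone is not valid input.
by_vendor_then_part = sorted(
    helicopter_parts,
    key=lambda p: (p["vendor"], p["part_number"]),
)
```

Choosing the data structure, the field names and the tuple-valued sort key is exactly the work the "concept" leaves unsaid.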

          1. doublelayer Silver badge

            Re: That's not what he said

            That's just syntax, though. There are languages that can express that sort in fewer statements, requiring less knowledge of the internals, and obtain the same result. A search like that in SQL looks a lot like your sentence, except that you'd use specific column names instead of things like "the vendor" and "part number". Something could take a sentence that looks natural and translate that into the sort you want. That is just a compiler that uses a language that's closer to English. I've seen a few of those, though I tend not to like them.

            What you're describing is a bit unclear. It's easy to imagine a program capable of understanding a broad concept: "I want a program to translate text from one human language to another, supporting at least English and Swahili", and going out and building such a thing. It's not easy to imagine how you'd actually obtain such a program, but that's what the hype around AI appears to be promising. Meanwhile, your version appears to involve someone being much more specific about how the job gets done, yet somehow without bringing back all the restrictions that being that specific tends to entail. You're clearly not saying that you can have a one-sentence spec, but I'm having trouble figuring out how your spec and a program differ, except that more of your spec looks like English.

          2. Ian Johnston Silver badge

            Re: That's not what he said

            a compiler will not accept "sort the list of helicopter parts by vendor name and within that larger search order the list of parts for each vendor by part number" as input.

            Whether you put it that way or

            sort(HelicopterParts, "Vendor", "PartNumber");

            or

            "rendezze a helikopter-alkatrészek listáját a gyártó neve szerint, és a nagyobb keresésen belül rendezze az egyes szállítók alkatrészlistáját cikkszám szerint" (thank you, Google translate - I know no Hungarian)

            is merely a matter of syntax. You're still using the compiler to go from concept to code.

            1. Anonymous Coward
              Anonymous Coward

              Re: That's not what he said

              English, and other human evolved languages, have been proven time and time again to be not precise enough to describe complex operations and calculations. That is why we invented mathematical notation and programming languages. Even if AI could write code, rather than just copying snippets of human-created code, you still need a precise way of communicating the spec to the AI, and that in itself is a programming language.

              1. Roland6 Silver badge

                Re: That's not what he said

                Back in the 1980s some experts were observing that Vedic Sanskrit, with its precision, had potential for computer and AI applications…

      7. druck Silver badge

        Re: That's not what he said

        He said CODERS would be replaced by software.

        That I can believe (eventually, some future generation of AI well beyond the current one) because that's basically rote translation of a concept into code. That's easy, a lot more people can do that than the truly difficult part of programming.

        No, it is not rote, and while it might be easy to do it badly, it requires some skill to do it well.

        I've seen code knocked out by outsourcers from the sub-continent, and while it seemed to meet the requirements, it was a horrific mess of bloated cut-and-pasted code, completely unmaintainable and inefficient. We ended up not using a single line of it. The code produced in house was elegant, efficient and maintainable.

        When you use an AI which code do you think the training set will contain? Vast quantities of low quality shovelware, or a much smaller set of well written code?

    3. Yorick Hunt Silver badge

      I remember back to at least 1981 or 1982, when software called "The Last One" was released, claiming at the time that you'd never need a programmer again.

      1. unimaginative

        It goes back to well before that. That was the aim of Cobol and its predecessors in the 1950s.

      2. Ian Johnston Silver badge

        I remember reading about that in - I think - PC World. And now I find that it even has its own Wikipedia article.

    4. StrangerHereMyself Silver badge

      I've been hearing the same story for 40 years or more. I can still vividly remember my dad telling me not to become a programmer in the early '80s when he showed me an article on so-called Expert Systems. People would tell them what they wanted and it would produce a complete working program as if by magic. We all know how that panned out.

      I believe there's a deep desire among companies to replace these expensive and obnoxious software developers, as we can see with all the hype waves (Expert Systems, Low-Code, AI). It will eventually become reality, but only when computers have matched human beings in intelligence. And I believe we'll have different problems on our minds when that happens!

    5. Ian Johnston Silver badge

      Computer programmers will be replaced by software? Where did I hear that before...

      When real coding was replaced with high level languages which let you specify what you want to happen, and a backroom system decides how to get a computer to do it?

      1. werdsmith Silver badge

        I remember seeing a job advert once, asking for skills in using RAD tools.

        Haven't seen one like that for a long time.

        1. katrinab Silver badge
          Meh

          Because most stuff these days is web front-ends and there aren't really any RAD tools for that.

          OK, there are some, like PowerBI and Tableau, and lots of people do use them.

  3. Anonymous Coward
    Anonymous Coward

    Coding was the first thing outsourced offshore decades ago. It hasn't been a good career choice for half the world for ages.

    1. Martin
      Happy

      I think perhaps the offshore coding teams might be in danger from AI (though I'm not holding my breath) - but the people who supply the specs and did the design for them will still need jobs.

      And there are plenty of "coding" jobs in this country which also involve a bit more than just being a code monkey. But you've still got to know how to code.

      So ignore this doomsayer from Nvidia. Go forth, young person, and learn to code. Despite the comment above, there are still plenty of jobs in tech, and still not enough people to do them.

    2. captain veg Silver badge

      Re: Coding was the first thing outsourced offshore decades ago

      ... and the results have been... er, mixed.

      -A.

  4. Dr Who

    The coders are dead! Long live the coders!

    Coding is the process of telling the machine what to do. First however you have to understand what and how you want to tell it, and that's the tricky bit.

    There is a new IT discipline called Prompt Engineering, which people are making a lot of money out of. Prompt Engineers develop the natural language questions / instructions you enter at the AI/ML prompt. They can even develop template prompt texts which you can vary certain parts of (you might call these variables user input). All of a sudden, Prompt Engineers look rather like software developers.

    1. Dom 3

      Re: The coders are dead! Long live the coders!

      "Coding is the process of telling the machine what to do."

      Good code tells the next human to look at it (or yourself a few months down the line) what it is you are telling the machine to do.

  5. Moldskred

    Isn't it funny how they never seem to think that business management can be replaced by LLMs?

    1. unimaginative

      Actually, some layers of management are good candidates for replacement by LLMs. I suspect others already have been replaced by software: e.g. I know software decides routes in logistics, that must have been at least partly a junior to mid management decision before that.

      There have been computers doing financial trading for a while too.

    2. heyrick Silver badge

      Given some places I've been, you could simply get rid of half of them and the only difference would be less snouts in the trough...

    3. StrangerHereMyself Silver badge

      Or politicians :)

      1. Androgynous Cupboard Silver badge

        Well I'm damn certain that people that attempt to predict the future of the industry can be replaced by LLMs. In fact I'm not entirely sure that hasn't already happened.

  6. Yorick Hunt Silver badge
    Mushroom

    Yeah, don't let them grow up to be programmers...

    ... Let them grow up to be gamers, to provide a stable income stream for nVidia.

  7. OhForF' Silver badge

    >"The technology divide has been completely closed."<

    Dunning Kruger in full effect?

    If you don't know how to interpret the code the LLM comes up with, how will you spot it when it provides something that is likely according to the training data but still wrong?

    1. Rafael #872397
      Devil

      Re: If you don't know how to interpret the code the LLM comes up with

      You could ask another LLM what's wrong with it. With the proper tools (and prompt engineering!) you could even let them talk between themselves, discuss the code, fight about indentation and language preferences, disagree about variable names, and just stop working together because their partner is an a**hole. Just like real meatbag programmers!

      Then you can add a third LLM to act as the group's leader and a fourth as project manager and see what happens. It'll be LLMs all the way down!

    2. Ian Johnston Silver badge

      How often do you check the output of $COMPILER?

      1. pip25
        WTF?

        How often is the output of a compiler unpredictable, including changes between runs, using the same input?

      2. doublelayer Silver badge

        Only when I've debugged it for weeks and have started to come to the conclusion that, maybe this time, it's not my fault. It turns out that it usually still is my fault. The reason I can do this is that the compiler gives me the same result each time I compile the same code, so it is possible to change something, see if it helped, and change it back if it did not.

        And sometimes, it is the fault of the tools. While I tend not to find many compiler bugs, I have found and fixed bugs in standard libraries before. It starts the same way with me assuming that my code's wrong somehow and I can't see it, and I start to read the library code to see what it is doing with my input because I can't see it. Sometimes, I notice that it's the one doing the wrong thing.

  8. Anonymous Coward
    Anonymous Coward

    At the moment machine learning makes everyone a questionable programmer, depending on how well they can find flaws and hallucinations in the generated code. There may also be copyright issues.

    1. _olli

      Never before have so many been able to write code that they can't debug.

  9. Howard Sway Silver badge

    the upskilling process, I believe, will be delightful

    So will the contract rates for skilled programmers who get desperate phone calls to come in and sort out the total fucking mess made by anybody stupid enough to actually buy into the idea that he's selling.

  10. Anonymous Coward
    Anonymous Coward

    Don't train just one skill

    I have literally been hired as a programmer for my ability to drive a counterbalance forklift truck so the company didn't have to hire someone for one hour per month.

    1. Martin
      Happy

      Re: Don't train just one skill

      And you probably had more fun in that one hour per month than you did in the rest of your job!

    2. Boris the Cockroach Silver badge

      Re: Don't train just one skill

      Ironic really, as I was hired to drive the forklift, yet seem to have ended up doing the programming.

      But can AI help me to program?

      Well, I can describe toolpaths and robot motions to it... but I'd have a hard time convincing myself the code spat out is correct (I have a hard time with the output of the CAM package).

      But if you have to be frighteningly specific in describing to the AI what you want it to output, how is that different from regular programming? High-level languages are merely a way of abstracting the lower-level code into more compact yet complex commands.

      EG

      Add 1 to variable counter and store the result in counter.

      C=C+1

      Inc C

      All equally valid, and yet a lower and lower level description of what you want done (I'd put it in binary too but can't be arsed to look up the op code).

      Until AI gets the ability to understand a problem described to it, it's just another programming tool - and, more than likely at present, an unreliable one.
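The abstraction ladder above can be walked inside a single language. A minimal Python sketch (the function name is invented) showing the high-level form of "add 1 to counter" and, via the standard `dis` module, the op-code layer underneath it:

```python
import dis

# "Add 1 to variable counter and store the result in counter",
# written at the high level:
def bump(counter):
    counter = counter + 1
    return counter

# One rung down the ladder: the bytecode the interpreter actually
# executes - the "Inc C"-ish layer, op codes included.
dis.dis(bump)
```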

      1. Roland6 Silver badge

        Re: Don't train just one skill

        Your point raises interesting questions:

        Why does the AI need to produce high-level code that goes into a compiler etc.?

        Who is going to babysit the AI: feed it the specification, take its output and feed it into the compiler, then run the test suite, etc.?

  11. Fred Daggy Silver badge
    Pint

    IT is the art ...

    IT (Programming, Infrastructure, Security ... everything) is the art of giving the end user/customer what they need, despite being told what they want.

    1. Anonymous Coward
      Anonymous Coward

      Re: IT is the art ...

      Yeah. I've been at the receiving end of that attitude far too often. Programmers who think they know better than I do what my work involves and how I ought to do it. See also: The Post Office, Birmingham City Council, the NHS ...

      1. doublelayer Silver badge

        Re: IT is the art ...

        Sounds good, as long as you're willing to be the kind of person I've seen: one who tells me not to explain why I think this is a bad idea, so I build exactly what you said to, and then the game of who to blame for it not working right starts up. Programmers can easily misunderstand what is needed, but usually it isn't because they got a clear answer and rejected it. Usually it's because their assumption about what a vague spec meant was wrong. That is why, whenever possible, I spend a lot of my time making sure that I understand what you need and you understand what I can do and what I'm currently planning to do. Skipping or rushing that process usually produces code that neither of us wants to see.

  12. Pascal Monett Silver badge

    "Jensen Huang believes"

    I'm sure he does. He's an electrical engineer, not a programmer. Obviously, he thinks that, as long as you can find some code on the Web, you don't need a programmer.

    Unfortunately, there is no pseudo-AI that is going to code Salesforce, or SAS, or an ERP. I don't care that there are now "prompt engineers", you still have to know where to place the prompt in the code.

    Programmers are not going away any time soon.

    1. Flocke Kroes Silver badge

      Re: "Jensen Huang believes"

      Let's see him put nVidia investors' money where his mouth is: ML generated video drivers!

    2. Yet Another Anonymous coward Silver badge

      Re: "Jensen Huang believes"

      On the other hand, you could have said in the '60s/'70s that all these computers were going to need millions of people to become electronics engineers to build them.

      There was even a scare in the '60s that, if computer use took off at the then-current rate, more than 100% of the population would be needed to do PCB layout (then done by hand)

      I can see the armies of in-house programmers all producing the same sales forecast and inventory reporting routines in SAP/Oracle being the first to be replaced by AI.

      Unless you think there is something ineffably spiritual in Birmingham city council-tax operations that can only be done by a human.

      1. werdsmith Silver badge

        Re: "Jensen Huang believes"

        I worked at a place a long time ago; they had a PCB assembly department that they called the wire shop. Lots of chattering ladies would sit folding component legs and pushing them through PCB holes, soldering them and trimming the excess.

        Then they got a flow-solder machine and fewer ladies. Then they got automatic test equipment and more ladies who would read the ticket from the ATE, remove the indicated solder bridges, correct the wrong components and put in the missing ones.

        Then they off-shored all the manufacturing. Over the years of reading the Register I've been reading about how all IT work in developed nations was going to be off-shored to developing nations, and quite a lot of companies tried that, and much of it didn't really work out. Then people were concerned about Cloud providers taking all the infrastructure work, but seems that is going through the same cycle as the off-shoring. I expect AI will have to go through that process too, before the reality occurs.

        That company that off-shored its manufacturing is now doing its own manufacturing again, albeit with pick and place machine operators. One of the pick and place ladies is from the original wire shop and is back after some decades.

      2. Flocke Kroes Silver badge

        Re: something ineffably spiritual

        Ignorance outnumbers knowledge in all fields and that is certainly true of programmers. A major contributor to the problem is that most employers do not know how to feel the quality - of programmers or software. I do not know the root cause of the Post Office software disaster: the wrong people, a poor description of the problem or lack of time (probably a mixture of all three). I can be certain that crap software outnumbers working software in the training data. That sets my expectations of the quality of the source code output of the modern generation of travesty generators. Apparently some people think they are more productive correcting Copilot output. If ML outputs a significant quantity of valid code I would be nervous about touching it because of the high probability of incoming copyright litigation.

        We already have young humans copying bad code examples until they gain enough experience to know better. I am sure ML can do the first part of that faster - and generate huge quantities of garbage training data for Copilot Version 2.

        When yet another shovel salesman hypes yet another gold rush my enthusiasm rivals Marvin's.

      3. doublelayer Silver badge

        Re: "Jensen Huang believes"

        Throughout the development of tools, we've reduced the number of people required to write any given program, and by doing that, increased the number of people who want programs written. Yes, it will probably happen again, whether LLMs are good enough to do so I'm less sure, but something will eventually start to automate the basic programs that some people have to write. I don't think the people having to write them will mind too much, both because having to write that kind of thing is really boring and because there will be more things that need writing. This is a problem for someone who wants to learn one tool and keep using it forever, but that hasn't worked for programmers for any of the previous years and they should know that it's not going to work now either.

    3. Electronics'R'Us
      Devil

      Re: "Jensen Huang believes"

      Well, I am an EE, but I have written a lot of code for embedded systems of my (or my team's) design.

      That is an enormous market, incidentally.

      Some of the skills are setting up the underlying hardware [1] which requires a rather solid knowledge of the internals of the microcontroller you happen to be using [2]. I don't see any 'AI' (which although artificial is anything but intelligent) being able to do that in the foreseeable future.

      1. There are hardware abstraction libraries available, just about all of which are poorly written, very opaque and riddled with horrible corner-case problems. Train the model on those and hilarity will ensue. I had a library function (DMA initialisation in one case) that was several call levels deep and I replaced it with 3 lines of code. The same issue exists for just about every on-board peripheral library function provided by the vendor. For precision work, using callback functions is a major no-no, but they proliferate in the various HAL libraries.

      2. Modern devices have several thousand pages of documentation without considering the underlying core and assembly language and it can take a long time to fully understand all the functionality and just how to invoke their operation (even order of operations can make a significant difference).

      1. Roland6 Silver badge

        Re: "Jensen Huang believes"

        > I had a library function (DMA initialisation in one case) that was several call levels deep and I replaced it with 3 lines of code.

        When I developed the RTOS for a new computer, with the EE (who designed the motherboard) and component reference documentation we developed a suite of hardware library functions/macros. We knew these weren’t particularly efficient, they did however, capture the knowledge and logic necessary to drive some of the hardware, so we could quickly build code and check its logic flow, with correct logic we could then go back through the code (ASM) and optimise it, doing as you note - replacing library code with the specific instructions necessary. Doing this enabled me to get him to modify his design (*) to better support the RTOS.

        (*) from memory some of the components could be wired differently to support different modes of operation. Plus the hardware defined the ROM/RAM memory map.

  13. Mike 137 Silver badge

    "longer necessary to prioritize computer science and coding education"

    Both?

    'Coding education' is strictly training, and is the most minuscule element of computer science. Computer scientists set the parameters for the design of computers and solutions to computing problems. Coders implement already defined algorithms using specific languages. Both (together with other skills such as algorithm design) are necessary contributors to systems and software development, but one of the fundamental reasons why current mainstream computing technologies are such crap is that almost everyone thinks computer science, software development and coding are all the same thing -- defined by coding.

    Quite apart from which, if neither is to be taught, who will be competent to check that the AI hasn't generated garbage, let alone fix its cock-ups?

    1. Yet Another Anonymous coward Silver badge

      Re: "longer necessary to prioritize computer science and coding education"

      It is probably true that any old CS degree/3-week boot camp is not going to lead to an automatic highly paid job in the way it has (occasionally) done for the last 20 years.

      There is still going to be a demand for a different kind of software engineer, probably higher paid and more specialised.

      And there probably should be more exposure to coding in schools - not because 90% of kids are going to be C++ devs, but because everybody is going to be dependent on computers and people shouldn't think they are magic.

    2. Plest Silver badge
      Happy

      Re: "longer necessary to prioritize computer science and coding education"

      One skill AI can't replace: IT people with 20+ years' experience of dealing with panicking, headless-chicken PHBs on P1 calls, shouting the odds and demanding full rollbacks on a project that's 27 hours down the track and has some minor issues!

      Yeah, AI coding will get you out of that sort of situation - not years and years of on-the-job diplomacy training in handling management during a P1!

      As a very famous character used to say, "Am I bovvered?!"

  14. Wonderdog

    Shock horror....

    ...company infamous for price gouging when their products reach a dominant/essential market position encourages society to de-prioritise skills that would decrease reliance on their product.

    Think I'll pass on that gem of advice Jensen.

  15. Anonymous Coward
    Anonymous Coward

    If you keep training computer scientists then one of them may come up with a better way of doing "AI" that doesn't require our expensive hardware

  16. Evil Auditor Silver badge
    Joke

    Become a toilet manufacturer. As long as there are people, there is crap.

    1. CapeCarl

      Too many coders and not enough....?

      Become a plumber, electrician or carpenter... It's hard to "virtualize" or outsource... And where I live (a giant sandbar sticking out into the Atlantic off Massachusetts called Cape Cod, giving me a slight advantage if I were to kayak from New England to England) we don't have nearly enough people with said skill sets.

      ...Well I was set on being a car mechanic until I learned how to code in 12th grade in 1974.

  17. Youvegottobe Joking

    This crap is for the CIOs

    A company making many billions from AI is telling CIOs they need to buy more AI hardware because very soon they'll be able to get rid of one of the last remaining bits of their 'IT' workforce.

  18. heyrick Silver badge

    A lot of software is awful and this is without AI. Replace programmers (even poor ones) with AI and it'll be a complete shitshow.

    Which means this prophecy will not happen as any company dumb enough to try will be enough of a warning to the rest that there's no I in AI and that's what the meatsacks are there for.

    1. ecofeco Silver badge

      Seems like a complete shit show now.

      So of course they will make it worse.

  19. cschneid

    Application Development Without Programmers

    A book by James Martin. ISBN-13 9780130389435. Copyright 1982.

    This is a repeat of the wishful thinking behind end-user computing, 4GLs, I-CASE, etc. That someone in the C-suite wants to reduce headcount is not a surprise.

  20. claimed Silver badge

    Wow so in the future most people are farmers, nurses or blacksmiths. There is a rich class who rule over the others, and yes there will need to be some paper pushers but there are plenty of rich sons and daughters to fill those skilled / educated roles.

    Why bother educating the proles, they are happy farming and drinking on the weekends!

    1. Anonymous Coward
      Anonymous Coward

      Why bother educating the proles, they are happy farming and drinking on the weekends!

      After 20 years in IT you have no idea how much I yearn for that life - working at something outside or making things by hand and earning enough to live a simple yet comfortable life from - absolutely, yes please!

      1. Evil Auditor Silver badge

        I fully understand. The problem is that my lardy arse got used to a lifestyle that requires a relatively high income, certainly much higher than what I'd get with making things by hand.

        1. Plest Silver badge
          Thumb Up

          nailed it

          The best justification of why I myself still bust a gut to work 45+ hours a week in my mid 50s doing sysadmin work, instead of taking my camera, my walking boots and buggering off to Scotland to live a simple life just walking and taking photos!

          1. Evil Auditor Silver badge

            Re: nailed it

            One day, Plest, we shall meet while taking photos in Scotland...

    2. ecofeco Silver badge

      Beggars in Spain.

      By Nancy Kress

  21. Anonymous Coward
    Anonymous Coward

    It's a nice story to tell share holders but it's obviously bollocks.

  22. trevorde Silver badge

    Back to the future

    Was told by a 'career coach' that programmers would be obsolete as it would all be done by computers. I ignored him and had a successful 35-year career writing code.

  23. Snowy Silver badge
    Terminator

    Robots and AI

    They can do everything, all we need to do is train to consume and reproduce.

    I'm not sure they could also do both of them as well!

    Train to stop the robot uprising?

  24. BPontius

    That will help

    So telling young people not to learn programming or earn a degree in tech or science, because AI already has it covered, will help with the talent gap in tech and science how? So more people get degrees in "How to skin a fly", a term a boss of mine used to describe the useless majors people pursue that have little to zero use in the practical world. The lack of understanding of science and tech is the reason we have people who believe the Earth is flat, or that Covid comes from 5G cellular towers, or the anti-vaccination crowd. AI is still a baby, barely able to crawl, unable even to understand what it is saying. AI WILL be the destruction of humanity if we continue blindly entrusting our future to it, while we pursue the challenges of making TikTok videos, hunting Pokémon or watching Taylor Swift's every move. INSANITY!!!

    1. fg_swe Bronze badge

      Steve KIRSCH

      He is the inventor of the optical mouse and he analyzed the data:

      Covid vaccination killed 10x more people than it saved.

      Having seen CDC VAERS data I agree with KIRSCH.

      1. Androgynous Cupboard Silver badge

        Re: Steve KIRSCH

        My biologist friend says the vaccine was fine, but the lasers in optical mice are actually death rays and are responsible for the deaths of millions.

      2. Ian Johnston Silver badge

        Re: Steve KIRSCH

        Covid vaccination killed 10x more people than it saved.

        It's at times like this that I wish El Reg had an "Ignore User" button, because otherwise I have to remember just who can't be trusted on anything.

      3. Roland6 Silver badge

        Re: Steve KIRSCH

        > Covid vaccination killed 10x more people than it saved.

        I’m sure a CoViD vaccine developed by an AI would have killed far more people…

  25. trevorde Silver badge

    Better career choice

    'Prompt Scammer' to trick chatbots into giving you free stuff

    https://www.theregister.com/2024/02/15/air_canada_chatbot_fine/

  26. martinusher Silver badge

    Logic seems a bit flawed

    Being able to code is a bit like being able to write in a language such as English. It's an important skill for communicating, but just as being able to write English doesn't make you a great (or even mediocre) author, being able to code doesn't make you a programmer. As with English, relying on generative AI produces very high-quality imitative output; there may be some original gems in there, but by and large it's just rehashing what the machine's seen before (that is, been trained on). Coding, being a restrictive set of language skills compared to a human language, is going to be even more formulaic. Useful, but it's really missing the point.

    We may already be at a critical place in coding anyway, be it done by humans or machines. I've been looking at a project's code recently and I noticed that it was primarily built from ready-made components -- that's fine, we're supposed to do that -- but at the same time its 'core' seems to be missing. This isn't unusual, though, because I've seen similar assemblies in other projects, so I start to wonder whether this is normal and it's me that's out of step.

    The problem is that if a subsystem just stops working for no apparent reason (invariably resource depletion, of course) there's no easy way to debug this; waiting 72-plus hours for the system to fail and then trying to figure out what's wrong isn't very efficient. Here, no amount of fancy editors, code completion, AI or even 'languages that are guaranteed to be memory safe' will help you -- system correctness becomes a matter of faith. (In real life, at least the life I used to lead before I retired, you had to actively go looking for trouble; there was no point in running the code overnight and hoping for the best.)

    Anyway, judging by the general behavior of those 'apps' that infest my phone, modern Windows and what-have-you, I'm probably in the minority, so I'll just retreat to my vacuum tubes (valves, to you lot) and at least keep my fingers warm. But I reckon if we don't deal with this.....

  27. Not That Andrew

    How screwed do you think NVidia is going to be when this AI bubble bursts? (My vote is: very.)

    1. Ken Hagan Gold badge

      NVidia will be fine since their products do a decent enough job. OTOH, anyone who has bet their life savings on the share price never falling ... ... tulips.

      1. Not That Andrew

        You have a very good point. And they'll have a _really_ nice war chest if they don't do anything stupid (not guaranteed).

    2. doublelayer Silver badge

      Not at all. Just as with cryptocurrency before it, Nvidia isn't invested in what the process makes. They're not the gold miners. They're the shovel sellers. Shovel sellers often do well. OpenAI and other LLM farms connected to big tech will be okay. It's the smaller AI startups that will take a hammering, and they may not even need the bubble to burst to have that happen.

  28. fg_swe Bronze badge

    Monetized Nonsense

    It is not as bad as "you will die if you do not take this quick&dirty vaccine"*, but surely the motivation is identical: spin the money wheel for Nvidia, a major producer of ANN processors.

    Current AIs have the complexity of a worm brain - about 10000 neurons.

    Humans have in the order of 100 billion neurons. Even cats and dogs have in the order of 1 billion neurons, which is why we can have a deep, good relationship with them. We see their creativity, their expression of love, their kindness, their acrobatic skills and many more things.

    Don't believe a word of Huang's; software engineering can be one of the most demanding intellectual endeavours.

    Just ask the AI to perform a task that it had not inside the training material and it will fail spectacularly. E.g. solve a linear equation system, write the complete Enigma source, compare+judge the V22 and the Do31. It will fail on all of these tasks.

    * very much the opposite was true, statistically speaking

  29. Kevin McMurtrie Silver badge

    What about CEOs?

    Now that's a job title for AI to take over. Sit through meetings, nod, decide what others should work on, and say things that make investors happy.

  30. fg_swe Bronze badge

    AI Nonsense Example

    Please compare all aspects of the do 31 and the v22

    ChatGPT:

    The V-22 Osprey and the Dornier Do 31 are both unique aircraft with VTOL capabilities, but they have notable differences:

    Design and Purpose:

    V-22 Osprey: Designed for military use, it combines the vertical takeoff and landing (VTOL) capabilities of a helicopter with the speed and range of a fixed-wing aircraft.

    Dornier Do 31: Developed as a vertical takeoff and landing (VTOL) transport aircraft primarily for military use, intended to carry troops and equipment.

    Configuration:

    V-22 Osprey: Features tiltrotor technology, with rotors that can tilt from vertical to horizontal for both vertical takeoff and landing and horizontal flight.

    Dornier Do 31: Utilizes a lift jet and lift engines to achieve vertical takeoff and landing, with wing-mounted engines for horizontal flight.

    Performance:

    V-22 Osprey: Faster and has a longer range compared to the Do 31, making it suitable for a wider range of missions.

    Dornier Do 31: Slower and with a shorter range compared to the V-22, but it was designed for specific military transport needs.

    Service History:

    V-22 Osprey: Used by the United States Marine Corps, Air Force, and Navy for various missions including troop transport, medevac, and special operations.

    Dornier Do 31: The Do 31 program was canceled after limited testing and development, so it never entered operational service.

    Technological Legacy:

    V-22 Osprey: Represents a successful implementation of tiltrotor technology and has been in active service for decades.

    Dornier Do 31: Though the Do 31 program was canceled, it contributed to the development of VTOL aircraft technology, particularly in Germany and Europe.

    Overall, while both aircraft aimed to fulfill similar vertical takeoff and landing capabilities, the V-22 Osprey has seen significant operational success and remains in service, while the Dornier Do 31 did not progress beyond the prototype stage

    Me: apparently the quality level changes often. This time it correctly described the main difference, but it has one big factual error (speed), which clearly demonstrates the worm intelligence. The Do 31 is the speedier concept, as it does not have silly huge rotors. It's a real jet, unlike the V-22.

    Also, not a single word about the horrible safety record of the V-22. It has killed more than 40 soldiers in accidents to date. Almost the definition of "pressed into service without reason"

  31. fg_swe Bronze badge

    Fail 2

    Solve the following equation system: y=2x-z, y=10x-z-24, 2y=3x-2-z

    To solve this system of equations, let's start by expressing y in terms of x and z from the first and third equations:

    From the first equation: y = 2x − z

    From the third equation: y = (3x − z − 2)/2

    Now, we can equate these expressions for y and solve for x:

    2x − z = (3x − z − 2)/2

    4x − 2z = 3x − z − 2

    4x − 3x = 2z − z − 2

    x = 2z − 2

    Now that we have x in terms of z, let's substitute this expression into one of the original equations to solve for y.

    Using the first equation:

    y = 2(2z − 2) − z

    y = 4z − 4 − z

    y = 3z − 4

    Now, we have expressions for both x and y in terms of z. Let's substitute these into the second equation to solve for z:

    3z − 4 = 10(2z − 2) − z − 24

    3z − 4 = 20z − 20 − z − 24

    3z − 4 = 19z − 44

    44 − 4 = 19z − 3z

    40 = 16z

    z = 40/16

    z = 2.5

    Now that we have found z, we can substitute it back into our expressions for x and y to find their values.

    From x = 2z − 2:

    x = 2(2.5) − 2

    x = 5 − 2

    x = 3

    From y = 3z − 4:

    y = 3(2.5) − 4

    y = 7.5 − 4

    y = 3.5

    So, the solution to the system of equations is x = 3, y = 3.5, and z = 2.5

    Me: if you want to see a proper solution (x = 3, y = 1, z = 5), use Wolfram Alpha or implement the Gauss algorithm yourself.
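    For anyone who wants to check, here's a minimal sketch of that Gauss algorithm in Python (the function and variable names are my own, not from any post); it solves the system rewritten in standard form as 2x − y − z = 0, 10x − y − z = 24, 3x − 2y − z = 2:

    ```python
    def gauss_solve(A, b):
        """Solve A x = b by Gaussian elimination with partial pivoting."""
        n = len(A)
        # Build the augmented matrix so row operations touch b as well.
        M = [row[:] + [rhs] for row, rhs in zip(A, b)]
        for col in range(n):
            # Pivot on the largest entry in this column to limit rounding error.
            pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
            M[col], M[pivot] = M[pivot], M[col]
            # Eliminate this column from all rows below the pivot row.
            for r in range(col + 1, n):
                factor = M[r][col] / M[col][col]
                for c in range(col, n + 1):
                    M[r][c] -= factor * M[col][c]
        # Back-substitution on the resulting upper-triangular system.
        x = [0.0] * n
        for r in range(n - 1, -1, -1):
            x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
        return x

    # The commenter's system in matrix form: rows are the three equations.
    A = [[2.0, -1.0, -1.0],
         [10.0, -1.0, -1.0],
         [3.0, -2.0, -1.0]]
    b = [0.0, 24.0, 2.0]

    print(gauss_solve(A, b))  # approximately [3.0, 1.0, 5.0]
    ```

    Back-substituting x = 3, y = 1, z = 5 into all three original equations confirms it, unlike ChatGPT's x = 3, y = 3.5, z = 2.5.
    
    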

    1. fg_swe Bronze badge

      Re: Fail 2

      I figured it had trouble with formatting and formatted the equations more nicely.

      Result was new nonsense.

    2. Anonymous Coward
      Anonymous Coward

      Re: Fail 2

      Can you please clarify what the alleged point is meant to be here?

      1. Anonymous Coward
        Anonymous Coward

        Re: Fail 2

        Okay, looking at it again, I assume that you had cut-and-pasted how ChatGPT(?) incorrectly "solves" a system of equations. Perhaps you should have made that clearer, though.

  32. MOH

    A man whose massive annual bonus is based on selling gfx cards, which are just losing sales from the crypto-card overload, is now insisting the future is the other overhyped use of said cards. I am shocked

    1. martinusher Silver badge

      The seeds of their downfall are already sown. They can't sell their product to China, they're only allowed to sell a downgraded version. So the Chinese will have to resort to either making their own or coming up with an alternative form of vector processor. (The only thing that's stopping them is our certain knowledge that the Chinese can't invent anything, they have to buy all their advanced stuff from Great White Brother and then steal the designs....)

      They will. When they do there will be a competitor that being a Generation Two model will probably perform better, cost less and use less power.

      It's happened before numerous times. There's no reason to believe it won't happen again.

  33. Tron Silver badge

    AI is already too advanced to get a job in politics.

    The Chinese are dependent on Western tech. Idiot politicians ban them from accessing it. The Chinese work their nuts off to lose that dependency. The Chinese will soon no longer be dependent on Western tech.

    The Western economy is dependent on migrant labour. Idiot politicians repatriate them during the pandemic, lock the doors and do everything they can to block them. Western economies fracture due to loss of competent labour in essential sectors: care home, health, agriculture, hospitality, retail. Economy slides. Strikes. Nothing works. Poverty increases.

    UK economy booming on trade with EU, great trade deals, strong Sterling, low inflation, low interest rates, EU opt outs. Idiot politicians throw it all away with Brexit. UK undevelops. Everything is now crappy. And they wonder why people hate them.

    You clearly have to be really dumb to be a politician, and you cannot make an AI that reliably stupid. So instead of becoming coders, become politicians, pay yourself loads and wreck the future, the way the current crop have wrecked the present. We will always need politicians to ruin our lives.

  34. T. F. M. Reader

    Just curious

    Where will all the training sets that nVidia GPUs will pound on to train AI models come from?

    Especially if one wants to do interesting new stuff. Mundane boring old stuff will have been packaged in libraries, I assume. You might imagine an AI "copilot" that will offer useful documentation to use the libraries, but that doesn't need to be super-intelligent or require too many GPUs to train, nor will it provide much more value on top of actually reading such documentation. Someone, possibly your kids, will still need to do some thinking.

    And for the new stuff - some coding. Enough to make up a decent training set for AI to help with documentation after the interesting new stuff becomes mundane old libraries.

  35. Anonymous Coward
    Anonymous Coward

    Wishful selfish drunk thinking

    Wait and see what will happen when there is no coders' output and the AIs have to "learn" from each other. It will be a nightmare, to say the least. AI as it currently stands is a kids' toy that does not even do any real work. As long as it is based on statistical convergence and not real declarative inference like the human mind, it will only be used to solve problems faster, not create new things. That is the end of it.

  36. CowHorseFrog Silver badge

    Who gives a rat's arse what the scumbags of humanity think?

    The American CEO is one of the last people anybody should be asking anything of, given their considerable examples of sociopathic inhumanity.

  37. Roland6 Silver badge

    “This Vulture can't help but think of life on the fictional starship Enterprise“

    Funny that, this reader can’t help but think of life on the fictional starship Heart of Gold and the Nutrimatic Drinks Dispenser when faced with the very simple user requirement: “all I want is a cup of tea”…

    On the other hand, resolving that requirement will require lots of AI processors (probably more than exist today) and a correspondingly large cloud computing bill….

  38. HKmk23

    Ha ha ha

    One word............Skynet

  39. imanidiot Silver badge

    If we don't have anyone writing code, we won't have anyone capable of understanding code. If we have no-one capable of understanding code, we lose the ability to check and control the AI-written programming. That way lies madness.

    You'll still need people capable of understanding basic programming so that they can go on to learn how to understand complex programming, so that they can then learn how to interpret specs, write documentation, specs, process descriptions, etc and translate these into useful components of programs. I just don't see it happening that LLMs can take over "programming" all that much. It's great at rote repetition of unoriginal thought, it sucks at actually novel invention (or actually, it just doesn't).

  40. Adam Trickett
    Linux

    AI needs to be telepathic first

    I program business software for a living and more than half the battle is trying to get the business to explain what they want. The notion that business leaders could write their instructions down and an AI could write anything useful from them is a stretch at the moment. We constantly have to ask questions to get them to articulate what they really want, and they constantly ask for the impossible...

    Eventually AI may replace most day-to-day coding, but not for a few years, and even then skilled people who know code will still be required to convert requirements into instructions for the AI to generate usable code.

  41. Reginald O.
    Meh

    Garbage in, garbage out?

    My limited tests of ChatGPT have resulted in a lot of well-articulated garbage responses. Yet, the Wizards say the sky is falling, AI will devour us.

    What am I missing?

  42. MAF

    A little test

    OK Jensen - we propose letting an AI run the nuclear power plant near you (what could go wrong?)

    When you need a medical procedure to be performed on you, the Doctor will use an AI

    You are not driving or flying anywhere without an AI at the controls

    Surely no objections there. You must be willing and able to put your health, wealth and security in the hands of this espoused AI utopia?

    No? Then stop spouting this drivel....

    1. Jeff Jones

      Re: A little test

      "we propose letting an AI run the Nuclear power plant near you (What could go wrong)"

      I operated nuclear power plants (as a licensed, trained operator) from the age of 19 to 21. I wouldn't let AI/ML anywhere near the operation of the plant, whether reactor or auxiliaries.

  43. Alan Bourke

    From our 'Grifters Getting Utterly Carried Away By Nonexistent AI' department

    See you all back here in 10 years for a good laugh at this

  44. Allonymous Coward
    IT Angle

    Irrespective of the (in)correctness of Huang’s comments, I’d suggest that science, manufacturing or farming - and plenty else - have always been just as important as IT, if not more so.

    In some ways they’re probably far more rewarding fields to work in, too.

  45. PinchOfSalt

    Given all the self-help (self congratulatory) books on being a CEO, surely this job could be done by an AI by now.

    More than enough training guff in those tomes to feed the thing and then unleash it onto the stock market.

    Perhaps Jensen would like to be the first turkey to vote for Christmas, now that he's invented Santa Claus?

    1. CowHorseFrog Silver badge

      What exactly does a CEO do except make stupid, pointless social media one-liners?

  46. anthonyhegedus Silver badge

    Quality of code

    Sure, an AI can come up with some code, but will it be good code, making the most of resources available? Or will we just use ever more resources because it's cheaper to throw more resources at it than to hire someone to write it properly in the first place?

    I suspect good coding skills will become rarer and be treated as less relevant, but we'd be wrong, because they're always relevant. AI will make mediocre the new norm, as it already has with things like customer service and the build quality of goods.

    The next hundred years or so is going to prove all the sci-fi writers right, I fear.

  47. mpi Silver badge

    Sure thing. All that's needed now...

    ...is an end user who can write a comprehensive spec outlining what the program should, and shouldn't, do, covering everything from error handling through UX to security.

    And do you know what we call the format of a spec that fulfills these requirements?

    Code. It's called code.

    1. theOtherJT Silver badge

      Re: Sure thing. All that's needed now...

      Code. It's called code.

  48. John H Woods Silver badge

    I've struggled for a long time ...

    ... to come up with a simple paragraph outlining both the capabilities and deficiencies of current generation AI. And yet, such a paragraph has already been created by AI...

    Somebody on Threads shared this AI-generated (from customer reviews) Amazon doozy:

    =====8<=====8<=====8<=====8<=====8<=====8<=====8<=====8<====

    ✰✰✰✰✰ 3 out of 5

    28 global ratings

    Customers say

    Customers like the appearance of the planter, mentioning it's beautiful and has a nice hanging pots. They like that the pots are weathered and have a sizeable chunk missing from the lip. They also like that it has shattered to bits and is a waste of money.

    =====8<=====8<=====8<=====8<=====8<=====8<=====8<=====8<====

    LLMs are great at recombining tokens, but they really don't have any mechanism for knowing what anything actually means.

  49. DJO Silver badge

    Not in my lifetime

    I use Visual Studio 2022 with the super-duper souped-up IntelliSense or Copilot (or whatever they call it), and while it does make some helpful suggestions, it also makes really stupid ones. I've yet to decide whether the time saved by the good ones exceeds the time spent undoing the stupid ones. Most of the helpful ones were what I was going to enter anyway, so all it saves is half a line of typing.

    AI can do the simple stuff but is there really a demand for a million different "Hello World" applications? As soon as you start doing something really complicated the AI is worse than useless.

    Also it's kind of important not to allow a skills vacuum to form when the current programmers start to die off.

  50. Jeff Jones

    Fact or Fiction?

    AI/ML is in no danger of replacing software engineers. At best, it is capable of generating simple code, not a design and the accompanying code that is efficient, scalable, supportable, etc. I use AI/ML for simple, repetitive tasks in my coding, usually having to correct the generated code or ignore it altogether as worthless. I started writing AI/ML with MS's ML.NET years ago, and I see a lot of good uses for AI/ML, as long as the software engineer sticks to reality and not science fiction and fantasy.

    AI/ML is not capable of thinking or reasoning on its own. AI/ML can only do what it has been trained to do, which depends on humans supplying very large training datasets; its generative output is limited by how good the training data is (which in turn depends on how knowledgeable the humans who made the training data were).

    I see a future where coders, hackers, "script kiddies", etc. will be largely (but not totally) left out or replaced, while those who approach software engineering holistically, with engineering and business acumen and an understanding of users, able to think creatively and reason deductively, will do well and have long careers.

    AI/ML is a good tool (of many tools) for such software engineers but can never replace them. AI/ML replacing software engineers is pure science fiction.

    1. DJO Silver badge

      Re: Fact or Fiction?

      I use AI/ML for simple, repetitive tasks in my coding, usually having to correct the generated code

      Is the time you spend checking the generated code greater or lesser than the time it would take you to write the same from scratch?

      I suspect in most cases it takes longer to check than to write it yourself and anybody who uses generated code without checking it thoroughly is going to have unpredictable and crashy programs. I'm sure AI/ML will get better but for programming critical software from design to publish, it's unlikely to ever be better than a meatsack.

  51. Ajlogo
    Linux

    I suggest that Programming will morph into Prompt Engineering.

    A Conversation with the Founder of NVIDIA: Who Will Shape the Future of AI?

    This is the title of his talk viewable on youtube.

    Higher performance in a higher-level language: this is what I think he is saying.


    Prompt Engineering is simply knowing the right questions to ask.

    Learning what the right questions are becomes an objective.

    This can take time.

    Many have tried a GPT and successfully gotten the rotten result they were looking for, proving garbage in is garbage out.

    To those folks I request that you try again with an open and curious mind.

    My experience is Prompt Engineering is the future of programming.

  52. Rob 63

    Given that we need multiple months and iterations to tease out requirements that were never stated at the outset, I very much look forward to this process that will understand the client's vague, poorly thought-out 'requirements', and then perform the miracle of generating usable code and deploying it in the correct place without any issues! Doesn't sound like a pipe dream at all, just like my various 'smart home assistants' do everything I ask of them! "Siri, play that song that I've listened to a thousand times." "I'm sorry, I couldn't find x by y in your library." *manually finds and plays song via phone* "Siri, what's this called?" "This is x by y." Ah yes, so the exact thing you just said doesn't exist, then.

  53. constance szeflinski

    And Tape is Dead

    The world has been declaring the death of tape storage almost since I entered the world of data storage in the 70s. As we all know, tape is alive and well. Declaring the death of coding is even more ridiculous.
