Re: How much time saved?
In the end, LLM-written code is about as good as would be produced by some third world outsourcing shop
That may be true, which is why they're not using the LLM to write code. They're using it to modify existing legacy code. Two of the examples they gave are migrating Java 8 applications to Java 17, and migrating JUnit 3 tests to JUnit 4.
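To make the JUnit piece of that concrete, here's a minimal before-and-after sketch of the kind of mechanical rewrite involved. The Account class and the test are invented for illustration; only the JUnit 3 and JUnit 4 conventions themselves are real:

```java
// BEFORE: AccountTest.java under JUnit 3. Tests extend TestCase, test methods
// are discovered by the "test" name prefix, and fixture setup overrides setUp().
import junit.framework.TestCase;

public class AccountTest extends TestCase {
    private Account account;

    @Override
    protected void setUp() {
        account = new Account(100);
    }

    public void testWithdrawReducesBalance() {
        account.withdraw(30);
        assertEquals(70, account.getBalance());
    }
}

// A trivial class under test, invented purely for this illustration.
class Account {
    private int balance;
    Account(int balance) { this.balance = balance; }
    void withdraw(int amount) { balance -= amount; }
    int getBalance() { return balance; }
}
```

After migration, the same test file looks like this (the Account class is unchanged, so it's omitted here):

```java
// AFTER: the same test under JUnit 4. Annotations replace inheritance and
// naming conventions; assertions come from org.junit.Assert.
import static org.junit.Assert.assertEquals;

import org.junit.Before;
import org.junit.Test;

public class AccountTest {
    private Account account;

    @Before
    public void setUp() {
        account = new Account(100);
    }

    @Test
    public void withdrawReducesBalance() {
        account.withdraw(30);
        assertEquals(70, account.getBalance());
    }
}
```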
Unlike writing all-new code, migrations like these are much better defined, much easier to train a model to perform... and much, much more tedious. They are high-effort, low-perceived-reward tasks. (Especially when success is defined as there being no discernible effect on the software itself: a successful migration is one nobody can tell you've done, because the post-migration code works exactly the same as before... except that you've sunk thousands of person-hours into making internal updates.)
Scouring hundreds of thousands of lines of code looking for deprecated or changed APIs, then updating the code to newer interfaces, is such a tedious job that many shops end up deciding to never do it at all. Legacy code is just kept limping along in legacy systems as long as possible, until it's eventually replaced with all-new code written to newer standards.
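The Java 8 → 17 side of the work has the same shape: find every call site that uses a removed or deprecated API and swap in the supported replacement, thousands of times over. Two real deprecations, shown in a made-up class, give the flavor:

```java
// Hypothetical legacy class, showing the kind of call-site updates a
// Java 8 -> 17 migration involves. The class is invented; the deprecations
// it works around are real.
public class LegacyCleanup {

    // Before (deprecated since Java 9, marked for removal since Java 16):
    //   Integer count = new Integer(42);
    // After: use the caching factory method instead of the boxed constructor.
    Integer count = Integer.valueOf(42);

    // Before (deprecated since Java 9, because it propagates the constructor's
    // checked exceptions without declaring them):
    //   Object plugin = pluginClass.newInstance();
    // After: call the constructor explicitly and declare its exceptions.
    Object instantiate(Class<?> pluginClass) throws ReflectiveOperationException {
        return pluginClass.getDeclaredConstructor().newInstance();
    }
}
```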
(When that wholesale replacement finally happens, the all-new code is very frequently a bare-bones affair. It eliminates the deprecated APIs and other outdated internals, yes, but it also sheds many features of the old codebase that, due to time, money, or design constraints, never get reimplemented in the first versions of the new system. That's how you end up with releases that hemorrhage functionality compared to their predecessors: the "totally rewritten!" codebase threw away everything that came before, including a ton of the features.)
If Google can use AI-assisted processes to escape some of that write-bandage-discard loop and break the ground-up rewrite cycle, I'm all for it.