So can we write an AI that writes code by Googling Stack Overflow? It can't do worse than Adobe...
'We think autonomous coding is a very real thing' – GitHub CEO imagines a future without programmers
At Pier 70 in San Francisco, California, on Wednesday, where ships once were built, code-hosting biz GitHub held forth on building and shipping code, and on the people who do so. Following a manic Looney Tunes-inspired cartoon intro, Chris Wanstrath, cofounder and departing chief exec, opened the keynote presentation by …
Wednesday 11th October 2017 20:35 GMT Anonymous Coward
Wednesday 11th October 2017 20:45 GMT Adrian 4
Wednesday 11th October 2017 21:49 GMT Tomato42
And we all know how good the people "up the food chain" are at writing precise and unequivocal specifications...
Yes, people copying solutions off Stack Overflow - *ekhm* - "programming" - will lose their jobs, just like drivers, lawyers and doctors, but programmers will not be the first people to go.
Either way, it doesn't look like the French Revolution will hold its title as the bloodiest for much longer...
Thursday 12th October 2017 02:02 GMT rtb61
Specification vs Instruction
There is a huge difference between specification and instruction. When you specify, you state what should be achieved; when you instruct, you state how it should be achieved.
Major real-world difference. The need for coding will reduce but not be eliminated. Obviously you will need to debug auto-generated code when results fail. Why did it fail? Was it an auto-coding mistake, or is your specification in error? Check the output, read the code; that's the only way you can find out.
So you will code much faster, because the bulk of the coding will be auto-generated from the specification. You then check the code, and audit it for security, compactness and speed, which is far faster than trying to check your own mistakes.
This post has been deleted by a moderator
Wednesday 11th October 2017 21:03 GMT Anonymous Coward
Wednesday 11th October 2017 21:03 GMT Paul J Turner
Wednesday 11th October 2017 21:16 GMT Anonymous Coward
Wednesday 11th October 2017 21:17 GMT Anonymous Coward
Aircraft? Never get off the ground...
Until they did.
Each iteration brings us closer. There are now robust no-code business application development platforms chomping their way through tasks that a few years ago needed skilled coders. They won't replace blue sky coding, but there will be a sea change in the next few years. DevOps will be fine, but it's a bleak outlook for grunt-work coding.
This post has been deleted by its author
Thursday 12th October 2017 09:10 GMT Chuunen Baka
I've been coding for 40 years. When I started, everything was built from scratch. Now I glue open source libraries together. You still have to write reams of complex code but the relative increase in productivity is amazing. There are loads of user self-service products of varying qualities but there'll always be a role for specialised computer wranglers. Maybe not as many as now and definitely fewer doing low level coding.
Thursday 12th October 2017 10:35 GMT Martin M
Arguably, integration of higher level libraries and components is *harder* than lower level coding, which is a fairly mechanistic process.
As for roles for specialised computer wranglers, it's notable that as abstractions and high-level reuse have increased, the number of developers has too. Probably because increased productivity means more problems are economic to tackle. Presumably that will end sometime, but we certainly don't seem to be nearing the inflection point yet.
What *may* happen is that there's a shift in skills requirements, with the ability to talk to end users to work together on accurate requirements being a bigger part of the job. Historically that's been mostly a different set of people than the coders (business analysts, UX designers, etc.) but it works a lot better when it's one person doing both that and the coding. People did some amazing stuff to efficiently solve business problems with 4GL languages and small teams back in the day, albeit the systems were difficult to maintain and often based on very proprietary underlying platforms.
Hardcore system-level programmers will still be required to build the lower level platforms, of course.
Thursday 12th October 2017 12:37 GMT John Smith 19
In 1970..software packages that had abstraction levels tailored to business niches.
"The users would then be able to produce their own custom application interactively."
WordBASIC, Excel macro language --> VBA
End users writing code is why COBOL had ADD and SUBTRACT verbs, because "end users" were not thought smart enough to use "+" and "-"
Wednesday 11th October 2017 22:46 GMT Vincent Manis
Well, OK, then
I'm a fan of the Fargo film and TV series, which depict decent people going to hell through a series of really bad decisions. Similarly, programming ended in the early 1960s, when FARGO, the programming system, was created. [https://en.m.wikipedia.org/wiki/FARGO_(programming_language)]. Not!
Of course, people don't write mortgage programs anymore, they use spreadsheets. We don't need to keep writing the same programs over and over.
Software development has changed many times over the decades. What hasn't changed is the need to have smart people create solutions to problems that nobody has solved before, and other smart people evolve those to meet changing needs. Intelligent programming assistants can definitely be a part of that.
Wednesday 11th October 2017 23:38 GMT ST
Thursday 12th October 2017 02:42 GMT John Gamble
Re: We already have autonomous code generation
Yeah, that was my first thought as well. But what gets compiled will be an even higher level language.
It's been decades since I've had to write anything in assembler -- compilers got much better, and the instructions that were known cycle hogs have been tamed. On the other end of the scale, I'm assuming my former co-worker who specialized in VAX COBOL has either learned new skills or retired.
I can't be certain what the Next Big Thing will be (although I'm seeing signs that explicitly written loops will be the next thing to become a rarity, as ranges-as-objects become common), but it will certainly come, and we will adapt or retire.
Thursday 12th October 2017 03:32 GMT ST
Re: We already have autonomous code generation
> the instructions that were known cycle hogs have been tamed
Oh I wasn't even referring to an optimizing compiler. Just a simple C compiler that does no optimizations at all.
Yes, someone can write a compiler-compiler -- something that emits code which, when compiled, becomes a compiler. There are known examples of such a thing - Yacc/Bison for example. But it's an infinitely recursive problem: who writes the compiler-compiler, who writes the compiler-compiler-compiler, and so on.
Someone who makes this type of general-purpose grandiose statement - that software will write itself some day - Real Soon Now - has no understanding of the process of bootstrapping a compiler.
Why I keep yammering on and on and on about compilers: because without a compiler, there's no software.
Thursday 12th October 2017 20:53 GMT John Gamble
Re: We already have autonomous code generation
"Oh I wasn't even referring to an optimizing compiler."
Yeah, neither was I. I was referring to machine instructions, which back in the Olden Days one tried not to use in tight loops (using, say, shift & add for multiplying your variable by a constant, or even shift & subtract for the division equivalent). Changes in hardware are just one of the things that also drive changes in software languages.
My point was more toward the fact that what we regard as a compiler will change in the coming decades, because the languages we use will have more features[1] and (one hopes) more safeguards.
1. Julia, in fact, has just-do-it operations in it that I could only dream about forty years ago.
Wednesday 11th October 2017 23:59 GMT Denarius
haven't we been here before ?
Ah yes, COBOL meant the PHB class and users would write their own code; then SQL meant management could directly query business data, followed by the great white hopes of VB and Delphi. At least the last two were not, AFAIK, sold on the basis that anyone can code business solutions.
Now we have a cultural elite and PHB class that despise technical knowledge and experience in the real messy world. This explains why wasting education time on coding instead of history, pure logic, maths and written communications has become an obsession with advisors.
Andrew also touched on the biggest flaw in all of the machine-learning Lego-style work. Who _knows_ what flaws the libraries contain that may cause ML and AI fails? Was there a recent El Reg article comparing two parallel coding libraries that gave different results? Lastly, hardware vs software speeds: spot on. I remember a 486DX with 16MB RAM and Win95C or early Red Hat running faster than the 8-core, 8GB machine running the Abomination or Debian I use at the moment, doing the same tasks. If software gets much slower, SteamPunks building Babbage Machines will be doing spaceflight calculations better.
Thursday 12th October 2017 07:15 GMT bombastic bob
Re: haven't we been here before ?
"Ah yes, COBOL meant the PHB class and users write their own code, then SQL meant management could directly query business data followed by the great white hopes of VB and Delphi. "
Don't forget "Forest & Trees". That was an interesting thing. It died, like similar things; I think MS Access may have killed it.
And one more point: SKYNET programmed itself, didn't it?
Icon, because, SKYNET mention.
Thursday 12th October 2017 15:31 GMT LDS
Re: haven't we been here before ?
VB and Delphi were marketed as RAD - Rapid Application Development - solutions. They eased GUI construction, removing the need for long stretches of code (the "in Windows you need 100 lines of code to show 'Hello World'!") and resource files. Sometimes they were sold as being able to build "codeless" applications just by using "components" (which are, after all, libraries, just with specific interfaces) and their properties to link them, but that worked only for very simple applications, good for demos, nothing more. Delphi came with a full, powerful, object-oriented language (and could be used without components as well, being fully able to call any OS API directly).
But the emphasis on the "components model" made some executives believe real applications could be built just from pre-made components with very little glue to link them together. One in a company I worked for tried to advocate a model where only low-skilled developers (those you could probably replace with some AI...) were needed to assemble high-level building blocks into a full application - he called them "blue-collar programmers", needed only to tighten bolts (that company went bankrupt a few years after I left).
It was the same time CASE tools were promoted, along with tools that were supposed to allow programming in a visual way without coding, e.g. Borland ObjectVision. None went far.
Let's see if AI can change it...
Thursday 12th October 2017 00:49 GMT FelixReg
Yep, I remember reading an article in Datamation, the main magazine for data processing types, back in the late '50s or so.
The article touted this great new thing, COBOL. COBOL would change the world. COBOL meant future managers could write their own programs rather than relying on pesky programmers!
The article's prediction was correct.
As time went on, "programmers" became indistinguishable from the "managers" of the article writer's imagination. And they wrote programs in COBOL.
And what programmer out there has not, at some time, been in the business of writing a replacement for themselves?
GitHub is, like the Internet, a huge advance in productivity. We're all richer because of them. Good deal.
Thursday 12th October 2017 03:08 GMT Amorous Cowherder
Maybe not for quite a while yet
In 25 years of working with DBs I've yet to find a good SQL code generator. SQL is such a simple language - extremely simple to learn and use, not that difficult to master - and yet when you start to go beyond 20-odd lines and use generators to build the code, they always come up with the most stupid code that is never efficient. You always end up taking the generated skeleton code and rewriting it, or worse, re-jigging the database and data structures to suit it, because the code is too complicated to deal with but sort of works. Sometimes people would rather do even more stupid things, like duplicating the whole data set into more bizarre forms, or chucking indexes around like wedding confetti, killing any hope of efficiency.
When I see a decent code generator for something as simple as SQL, one that produces a 1500-line explain plan for a 50-table join and works perfectly, then I'll listen, and you can start talking about code generators for more complex languages like C#, Python, Node, C++, etc.
Thursday 12th October 2017 03:44 GMT Teiwaz
Re: Maybe not for quite a while yet
"yet when you start to go beyond 20 odd lines and use generators to build the code they always come up with the most stupid code that is never efficient"
That's been true since at least dBase III.
Even if it is moderately efficient, it's usually totally unreadable - making extension or expansion impossible without a total rewrite.
Thursday 12th October 2017 04:34 GMT rmullen0
The guy is full of it
Thursday 12th October 2017 04:37 GMT rmullen0
Don't forget about security
Let us not forget that the elite corporations espousing AI have yet to write secure software themselves. How about they first focus their efforts on writing code that isn't riddled with security holes? As far as I am concerned, they are doing a terrible job.
Thursday 12th October 2017 05:39 GMT Anonymous Coward
the giants' shoulders
Wanstrath suggested that coding, which has been likened to standing on the shoulders of giants, will become more like standing on each other's shoulders.
Some of the code I see looks more like hanging off the testicles of each other trying not to fall into the abyss below.
[Edit] Mind you, that's just my own code. What some of the others are writing does not bear thinking about.
Thursday 12th October 2017 06:42 GMT Milton
Notwithstanding that AI doesn't exist and machine learning works only in extremely specific cases, I submit that the gulf between "There are oodles of libraries (of widely varying, often questionable quality)" and "Robots will write all the code" is absurdly vast.
And of course, GitHub might have some incentive to hype such an overblown prediction. I think a positively enormous pinch of salt goes with this.
Thursday 12th October 2017 07:26 GMT Steve Channell
Unfortunately, inversion-of-control containers make true dependencies very difficult to find, and dynamic scripting (e.g. monkey patching) makes it almost impossible. But that is not what this is about... It's about "how f*cked am I when someone delists their library from NPM?" If you need to ask... you already know.
Thursday 12th October 2017 08:21 GMT bfwebster
Been hearing this for over 40 years
The fallacy that happens again and again is the presumption that the value in coding lies in combining things that already work. It doesn't. The value is in inventing new things that haven't been done before. Otherwise, you're simply solving a problem that's already been solved, and there's very little value in that.
A great piece to read is "The Five Orders of Ignorance", an appendix in Philip Armour's book _The Laws of Software Process_.
Thursday 12th October 2017 08:23 GMT tutnes
No silver bullet still holds
I think Fred Brooks No Silver Bullet still holds water:
"there is no single development, in either technology or management technique, which by itself promises even one order of magnitude [tenfold] improvement within a decade in productivity, in reliability, in simplicity."
Thursday 12th October 2017 13:40 GMT handleoclast
Re: No silver bullet still holds
There used to be silver bullets in the past.
* Going from hard-wired (literally, using a wire-wrapping tool) programs to stored-program computers.
* Structured constructs like do loops to replace spaghetti gotos.
Etc. These things provided major improvements in programmer productivity and code reliability.
These days we just get a load of EPNS bullets. They look like silver. They're touted like silver. But years after, when all the hype has died away, they are quietly forgotten. Another management fad that died the death it truly deserved.
Occasionally we have a turd wrapped in cooking foil pretending to be a silver bullet. Pascal comes to mind, here, as does any language in which Niklaus Wirth had any involvement. Ob Wirth Joke: how you pronounce "Wirth" depends on whether you call him by name or by value - it's either "weerth" or "complete fuckwit who should never be let near the design of a programming language."
That's not to say that there are no more silver bullets waiting to be found. There may be. But going by the last few decades there probably aren't.
Thursday 12th October 2017 08:26 GMT Aging Hippy
Thursday 12th October 2017 19:43 GMT Bucky 2
My careers advisor told me the same thing, but in the '70s. I believed him. I wasted years trying to stay away from computer science. Eventually, I gave up trying to do anything "real" and did indeed become a programmer.
I frequently wonder how my life would have worked out differently, if the industry hadn't been telling schoolkids it didn't need them anymore.
Thursday 12th October 2017 08:29 GMT tiggity
I did play with CASE tools that went all the way from model to generating a code skeleton from that model.
But, to be useful, the model had to be in so much detail that it took far longer than just writing the code by hand: every i had to be dotted and every t crossed, with lots of stuff that's normally implicit explicitly detailed - and the skeleton still needed fleshing out with hand-crafted code.
.. and then the fun of when model changes needed ...
So, CASE tools interesting but still have a long way to go.
Saturday 14th October 2017 20:51 GMT acid andy
You're right about CASE tools. It takes at least as long as writing the equivalent code would have done and then, more to the point, it's a whole lot less fun than just writing the code. Luckily, with most tools you can knock out the basic code first, click a button and have it generate the CASE model from that afterwards if your PHB insists on having one.
Thursday 12th October 2017 10:54 GMT EnviableOne
Thursday 12th October 2017 12:59 GMT John Smith 19
"writing less code" blah blah "libraries, frameworks" blah blah asymptotic curve--> end of history.
When I put it like that does it not smell like odour du taureau ?
IOW, the amount of code being written by meatsacks may be falling (though aren't Excel macros the most used programming "language" on the planet by a very wide margin?), but the rate at which software "intelligence" (WTF that even means) is improving is slowing down, as the problems to be tackled in large systems become exponentially harder.
Low "intelligence" --> "Pick n mix" code generator
High "intelligence" --> "Write a new system called 'Universal Credit.' The project brief is in the 'projects' folder under 'UC'.* Come back to me with outstanding questions and an outline architecture. Your choice of the best development languages for the different environments, and benchmark data to deliver 2-3 second response times for major tasks."
Anyone like to guess how many problems are in those four lines, how hard they are, and how far away any real system is from dealing with them?
BTW, it's interesting that decades of code studies show the rate at which programmers actually write executable stuff remains at 12-20 LOC a day. So, yes, size does matter.
*I chose Universal Credit because that's a real system being built (though according to El Reg, passim, it's not there yet) by real people, on a serious scale, for serious money (with serious cost and schedule overruns).
Thursday 12th October 2017 13:01 GMT ocratato
More than Code
Building software of any complexity requires a design process. It may be a huge set of documentation, a few notes on a whiteboard, or some ideas floating around in someone's head - but it always exists, and in my experience it is rarely visible on GitHub or whatever repository the code is placed in.
An attempt to use deep learning AI to create an AI-programmer from the GitHub repository will not produce something that can take a set of requirements and produce a program.
Before we can use AI to do the work, we need to build repositories of designs for the AI to study.
Thursday 12th October 2017 13:34 GMT BarryUK
And the hype continues...
"People write less code now than they once did because so much functionality comes from libraries"
Is this true though? Where once people might have written a simple, standalone program they now write a full blown Enterprise Architecture based solution for about the same effort, because it's so simple to use frameworks like Spring, Hibernate, etc. Does this mean people are actually writing that much less code?
I remember, 30 years ago, my manager telling me programming would soon be obsolete thanks to CASE tools. 15 years ago Scott McNealy, no less, told me the same thing about OO. I reckon programming will see me through to retirement in another 15 years or so without any trouble.
This post has been deleted by its author
Friday 13th October 2017 01:07 GMT Fruit and Nutcase
Don't forget the Code Reviewers. A bit like Cockroaches surviving a nuclear holocaust, a particular type of code reviewing meatsack will survive coding armageddon in order to ensure the code generated by the AI programmers is formatted to the letter of the style guide. They may not understand the code in front of them to spot problems, but they sure will spot formatting that is inconsistent with the style guide.