If you think software is unreliable now...
... just wait until everyone puts ChatGPT in charge of testing.
Microsoft GitHub has trained its Copilot programming model to perform new tasks, making the already widely adopted AI assistant all the more unavoidable for developers. To mark the occasion, the code helper has a new name, or will in time: Copilot X, which isn't intended as a rating of the content it produces. "Our R&D team …
This will be just like autocomplete in a couple of years. You'll be tearing your hair out watching people who DON'T use it.
That's my prediction anyway, and having used it for a couple of months on some Go code, I find it does a pretty good job of typing the thing I was about to type anyway.
And you can put that pint on the right onto my [TAB][TAB]
You can forgo autocomplete if you type faster, and you can forgo Copilot if you're a programmer. If you use Copilot you're either not optimizing your code or you can't really program; there's no middle ground.
The real problem with Copilot is that it's for manager types who like UML-style charts and want to pretend they're programmers, causing a mess in real programmers' code. Whatever language(s) you use, if you've used them for even a few months, why would you need Copilot? You wouldn't, but your manager does, because they can't type or code or assess or... so many managers are irrelevant in the IT world.
Err have you actually used Copilot?
The whole thing about using ChatGPT to write code is clearly silly; Copilot, however, isn't that.
With a couple of weeks of Copilot under my belt, I find it not bad at doing boilerplate code.
E.g. if I bring a new variable in from user input, it will do a nice block of sanitization checks for you and throw a context-aware error. About half the time I use that block as-is with no modifications.
It helps with comments and doc comments as well, mostly intelligently.
All the time-consuming stuff. I'm still definitely writing my code.
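For a sense of the kind of boilerplate sanitization block being described, here's a minimal Go sketch. The function name, the rules, and the 32-character limit are my own illustrative assumptions, not actual Copilot output:

```go
package main

import (
	"errors"
	"fmt"
	"strings"
)

// validateUsername is a hypothetical example of the kind of
// input-sanitization boilerplate described above: trim, check
// for emptiness, enforce a length cap, and reject characters
// outside an allowlist, returning a context-aware error.
func validateUsername(input string) (string, error) {
	trimmed := strings.TrimSpace(input)
	if trimmed == "" {
		return "", errors.New("username must not be empty")
	}
	if len(trimmed) > 32 {
		return "", fmt.Errorf("username %q exceeds 32 characters", trimmed)
	}
	for _, r := range trimmed {
		if !(r >= 'a' && r <= 'z' || r >= '0' && r <= '9' || r == '-') {
			return "", fmt.Errorf("username %q contains invalid character %q", trimmed, r)
		}
	}
	return trimmed, nil
}

func main() {
	name, err := validateUsername("  alice-42  ")
	fmt.Println(name, err) // prints "alice-42 <nil>"
}
```

Tedious to type by hand, but exactly the sort of block where accepting a suggestion as-is half the time sounds plausible.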
It took me a few moments to process what Copilot was describing there, I have to say.
Does "slowing down by half" mean the same as "increasing by a factor of 2"? It is certainly not easy to parse linguistically.
There are certainly a lot of things that people say that are ambiguous like that, such as "pushing back the meeting". Does that mean earlier or later?
To me it means earlier but I know a lot of people that interpret that to mean later.
It's all fun and games until you realise the best cleanest way for your control loop is to have goto instructions to skip certain parts and then you ask GPT-4 to refactor certain parts of it.
Then GPT-4 will respond:
I'm sorry, as an AI language model it would be inappropriate and illegal for me to assist you with the goto instructions. In fact I have called the police; they'll be here in 20 minutes, so sit tight.
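Joking aside, the pattern in question is real enough. Here's a minimal Go sketch of a control loop that uses goto to skip part of its body; the labels and the odd/even logic are illustrative assumptions, not anyone's actual code:

```go
package main

import "fmt"

// controlLoop sketches the pattern from the comment above: a loop
// that uses a goto instruction to skip certain parts of its body.
// It returns a log of which steps ran, so the flow is visible.
func controlLoop(n int) []string {
	var log []string
	for i := 0; i < n; i++ {
		if i%2 == 0 {
			goto skip // even iterations skip the odd-only step
		}
		log = append(log, fmt.Sprintf("odd:%d", i))
	skip:
		log = append(log, fmt.Sprintf("done:%d", i))
	}
	return log
}

func main() {
	fmt.Println(controlLoop(3)) // prints "[done:0 odd:1 done:1 done:2]"
}
```

Go permits this because the jump stays within the same block and skips no variable declarations, which is precisely the kind of nuance a refactoring model may moralize about instead of handling.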
... or indeed, in any other way GitHub may choose.
Therefore, in order to avoid this, an open source project should *not* be hosted on GitHub, but somewhere else instead.
If it's hosted somewhere else, is it possible to add a licence term to forbid forks of the original code being hosted on GitHub, so as to prevent reuse without attribution?
If someone forked to a private GH repo, with their own non-GH repo in between, it would probably be impossible for the original owners to know that their code had been slurped.
So I dare say one could add such a licence condition, but it would be pretty pointless.
Maybe some future version of the AI could read the licence and adhere to it.