GitHub Copilot Chat turns blabbermouth with crafty prompt injection attack

GitHub's Copilot Chat, the chatbot meant to help developers code faster, could be helping attackers to steal code instead. Researcher Omer Mayraz of Legit Security disclosed a critical vulnerability, dubbed CamoLeak, that could be used to trick Copilot Chat into exfiltrating secrets, private source code, and even descriptions …

  1. GNU Enjoyer
    Facepalm

Wait, it's the "attackers" that are "stealing code"?

    Not microsoft github?

    After all, the prompter could go and find the copyright notice and license that were stripped off the source code, put them back on, and follow that license (very much unlike what microsoft is doing), yet it's the prompter, not microsoft, that is doing the "stealing"? (Spoiler: copying is not theft, and taking an infringing work and correcting the infringement to make it compliant is legal.)

    As for exfiltrating secrets, the responsibility lies solely with microsoft for copying those.
