If you give Copilot the reins, don't be surprised when it spills your secrets

One hopes widely used enterprise software is secure enough. Get ready for those hopes to be dashed again, as Zenity CTO Michael Bargury today revealed his Microsoft Copilot exploits at Black Hat. "It's actually very difficult to create a [Copilot Studio] bot that is safe," Bargury told The Register in an interview ahead of …

  1. GloriousVictoryForThePeople

    >Specifically, if these bots are accessible to the public, and we're told a good number of them are, they can be potentially tricked into handing over, or simply hand over by design, information to people that should not have been volunteered during conversations, it's claimed.

    I'll bet a lot of the staff don't have access to the documents due to security policies from the same manglement that is rolling out co-pillock willy-nilly.

    1. Anonymous Coward
      Anonymous Coward

      Thankfully we have some sane people in management - in our company you need to go quite high up to get any permission for any AI deployment exactly because of the risks.

      One risk is that we don't actually have a full picture of all the risks yet, and that tends to be a red flag in itself. Combining Microsoft with letting one of their new toys, whose risks are not yet well understood, loose on corporate data doesn't feel like a very bright idea.

    2. Sceptic Tank Silver badge
      Childcatcher

      I'm upvoting co-pillock.

  2. Anonymous Coward
    Anonymous Coward

    The truth is out there !!!

    "The bottom line is that businesses are the guinea pigs testing an experimental drug called "artificial intelligence," and we're not at a point where we know how to make it safe yet."

    This is the *Truth* and cannot be denied anymore !!!

    AI, in its current form, is a con and an *unsafe* con at that.

    It is an invitation to the worst denizens of the dark web to hack you in ways you cannot protect yourself from, because you do not/cannot *fully* understand the possible ways you can be hacked !!!

    It is like wearing a blindfold while being handed a grenade without the 'safety pin' engaged ... you know it is going to explode *but* don't know when !!!

    :)

    1. Anonymous Coward
      Anonymous Coward

      Re: The truth is out there !!!

      Dunno about you but I'd yeet that grenade.

      1. stiine Silver badge
        Mushroom

        Re: The truth is out there !!!

        In a windowless room with the door closed? Good luck with that.

    2. Anonymous Coward
      Anonymous Coward

      Re: The truth is out there !!!

      you know it is going to explode *but* don't know when

      Yes you do. It'll be more than 5 seconds, because by then you have used all the fingers of one hand and have to hold it between your legs to continue counting on the other hand.

      :)

      1. Sceptic Tank Silver badge
        Devil

        Re: The truth is out there !!!

        Better to be handed a safety pin without the grenade engaged.

  3. Pascal Monett Silver badge

    "It's kind of funny in a way"

    Oh yeah, it's downright hilarious, innit ?

    Redmond is going to have to work hard to keep itself from getting sued. Fortune 500 companies are not going to take its normal "use at your own risk" excuse on this.

    1. Claptrap314 Silver badge

      Re: "It's kind of funny in a way"

      And they get a license review in response.

    2. Jellied Eel Silver badge

      Re: "It's kind of funny in a way"

      Oh yeah, it's downright hilarious, innit ?

      This part really made me laugh, in an almost hysterical kind of way-

      Combine those with what Zenity marketing chief Andrew Silberman told us is nearly 3,000 Copilot Studio bots in the average large enterprise (we're talking Fortune 500-level companies), along with research indicating that 63 percent of those are discoverable online

      LOL! I mean, it's 2024 and this is still happening? How many critical vulnerabilities have started with Microsoft's autodiscovery, or just discovery, features? I can kinda see some potential benefits from Copilot on an external server in a DMZ, but am also imagining the horrors of securing any communications back into the trusted domain. There really needs to be some simple way to control what information Copilot can utilise.

    3. Michael Wojcik Silver badge

      Re: "It's kind of funny in a way"

      Get sued, sure. Get sued successfully, and end up paying more than a token amount — that's very far from certain. Litigation against tech companies in general, and tech giants in particular, has a poor track record.

  4. Claptrap314 Silver badge
    Megaphone

    "... all of the defaults are insecure." Well, this is Microsoft we're talking about...

  5. Ken Hagan Gold badge

    Do we need a new acronym?

    It's not "remote code execution" but it is "remote information extraction" and (as noted in the fine article) probably something we'll be seeing a lot more of in the next few years.

    1. Jellied Eel Silver badge

      Re: Do we need a new acronym?

      It's not "remote code execution" but it is "remote information extraction" and (as noted in the fine article) probably something we'll be seeing a lot more of in the next few years.

      Indeed. Pretty much all fictional depictions of dangerous AIs could have been very short, if only they'd been contained and constrained. But they escape, and usually bad stuff happens. Copilot doesn't seem much different. It really needs policies that can contain and constrain who can access it, and what it can access. So maybe a Copilot chatbot could be useful on a public server: it can only 'learn' from stuff on that server, so the risk is minimised given its knowledge base is stuff intended to be made public anyway.

      But I think you really wouldn't want it talking to any back-end systems unless you can impose very clear restrictions on what it can access and how it can use it. So probably, to be safe, airgap any public-facing servers and never let Copilot back into the network. Obviously that limits how it could be used, but at the moment it seems a very risky 'feature'. I think the same is true for internal networks: how they're segmented and what policies are applied between those segments. Administrators really need a way to ensure that Copilot is never installed or run on systems they don't want it on.

      Based on my own experiences as an average user, Microsoft is just pushing it out as an update and making it very difficult to terminate with extreme prejudice (see the sketch after this comment). But then that's also true for a lot of unwanted 'features' forced on users.

      PS: the best I can come up with for an acronym is AIE!, for Automated Industrial Espionage or Automated Information Exfiltration.
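
      On the "never installed or run" point: a minimal sketch of one way an admin might switch the Windows flavour of Copilot off per user, assuming the "Turn off Windows Copilot" policy value (TurnOffWindowsCopilot under Software\Policies\Microsoft\Windows\WindowsCopilot) that Microsoft documented for the preview. The exact key path is an assumption to verify against current documentation, centrally managed Group Policy or Intune is the saner route on domain-joined kit, and this does nothing for Copilot Studio bots, which need tenant-side controls instead.

      # Sketch only: writes the assumed "Turn off Windows Copilot" policy value
      # for the current user. Verify the key path before relying on it; prefer
      # Group Policy / Intune where available.
      import winreg

      KEY_PATH = r"Software\Policies\Microsoft\Windows\WindowsCopilot"  # assumed policy path

      def turn_off_windows_copilot() -> None:
          # Create (or open) the per-user policy key and set the DWORD to 1 (off).
          with winreg.CreateKeyEx(winreg.HKEY_CURRENT_USER, KEY_PATH, 0,
                                  winreg.KEY_SET_VALUE) as key:
              winreg.SetValueEx(key, "TurnOffWindowsCopilot", 0, winreg.REG_DWORD, 1)

      if __name__ == "__main__":
          turn_off_windows_copilot()
          print("Policy value written; sign out and back in for it to take effect.")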

    2. Michael Wojcik Silver badge

      Re: Do we need a new acronym?

      Often it is, in fact, remote code execution, as people increasingly couple LLMs to "agentic" systems that perform other processing, control machinery, etc. Don't forget Copilot itself started off as a source-code generation engine; if you compromise that, you have RCE, just through an additional layer of abstraction.

    3. O'Reg Inalsin

      Re: Do we need a new acronym?

      It's fair to assume that information extraction will lead to more remote code execution.

  6. Phones Sheridan
    Holmes

    Hey Siri,

    Provide me with a list of all my competitors' suppliers, customers and products!
