
How AI could eat itself: Competitors can probe models to steal their secrets and clone them

Two of the world's biggest AI companies, Google and OpenAI, both warned this week that competitors including China's DeepSeek are probing their models to steal the underlying reasoning, and then copy these capabilities in their own AI systems. "This is coming from threat actors throughout the globe," Google Threat Intelligence …

  1. Anonymous Coward
    Anonymous Coward

    Professors are trying to steal knowledge from freshman students? Yeah, right.

    1. Philo T Farnsworth Silver badge

      You might call it. . .

      . . . A circle of jerks.

      1. Gavsky

        Re: You might call it. . .

        Indeed, or a 'Rotisserie of Tw*ts'...

      2. frankvw Silver badge

        Re: You might call it. . .

I have to admit that the idea of two AIs "probing each other" did give rise to some highly unusual mental images...

    2. MonkeyJuice Silver badge
      Devil

      Oh no. Someone is stealing output from their brain-damaged LLM based on stolen training data.

      Now there are two brain-damaged LLMs. Oh no.

      [continues about day]

    3. Mage Silver badge
      Coffee/keyboard

      trying to steal knowledge

      What underlying reasoning?

    4. Elongated Muskrat Silver badge

      You joke, but it's not exactly uncommon practice for university professors to claim the credit for work done by their postgraduates and postdoctorates, especially when it comes to publishing academic papers. This counts double if the person who did the actual work was female. It's a genuine problem in academia.

      1. Yorick Hunt Silver badge

        "Those who can, do.

        Those who can't, teach.

        Those who can't teach, teach teachers to teach."

        Your point is sadly all too true.

  2. b0llchit Silver badge
    WTF?

    Pot, kettle, all black

• 1) Humans create vast amounts of works and knowledge
    • 2) AI companies steal the collective works and knowledge of humans to put into an AI
    • 3) Humans steal the collective knowledge of AI to feed into another AI

    And #3 is somehow worse than #2?

    1. thames Silver badge

      Re: Pot, kettle, all black

      The entire Internet is having to implement measures to try to stop the American AI companies from stealing all their stuff and grinding their servers into the dust in the process, but that's OK.

      But if someone does the same to them, then that is bad, really, really bad and it must be stopped.

      I have zero sympathy for them.

      1. Doctor Syntax Silver badge

        Re: Pot, kettle, all black

        I have negative sympathy for them.

        1. Gavsky

          Re: Pot, kettle, all black

          I see your negative sympathy & raise you Absolute Zero sympathy...

          1. hedgie Bronze badge

            Re: Pot, kettle, all black

            My sympathy is an imaginary number.

    2. Rikki Tikki

      Re: Pot, kettle, all black

      Upvoted, but a reminder that IP theft used to be the "American Way".

The 18th and 19th centuries saw large-scale use of European industrial technologies to develop the US economy. Now, they don't like it when other countries do the same to them.

So, totally agree with the "pot, kettle" title.

      1. Long John Silver Silver badge
        Pirate

        Re: Pot, kettle, all black

        Yes, but one cannot 'steal ideas' in the Biblical sense of nicking oxen, asses, and wives. An idea and its potential for application, enlightenment, or amusement, is not diminished by wide usage; putting it as William James might have said - "The 'cash-value' of an idea bears no relationship to scarcity." Conversely, the ubiquity of an idea is no guarantor of its validity.

        What can cause damage is wrongfully claiming to be the originator of an idea. It's not just a matter of kudos. It bears upon recognition and reputation, each of which aid gaining patronage to support more originality.

        1. Anonymous Coward
          Anonymous Coward

          Re: Pot, kettle, all black

          Agreed ... with the consideration that bad ideas (eg. false news) can spread faster than good ones (eg. truth), and that the genAI babble mills promoted these days can right worsen that state of affairs by producing a continuous industrial source stream of such loony-frosted fruitcake notions.

          Wide propagation and usage of good ideas is a net positive. Damage can come from both wrongful attribution claims and wanton spreading of complete nonsense (outside of artistry), imho.

        2. Paul Hovnanian Silver badge

          Re: Pot, kettle, all black

          "The 'cash-value' of an idea bears no relationship to scarcity."

          Look up "rent seeking".

    3. Gavsky

      Re: Pot, kettle, all black

      "But, BUT - it's not fair! We're allowed to do it, but nobody else should!". 'Eff 'em - I'm laughing...

    4. Anonymous Coward
      Anonymous Coward

      Re: Pot, kettle, all black

Does the USA sue Europe and China for stealing Orville and Wilbur's flying machine?

  3. takno

    "Pirates stole my pirate ship. Pls send the navy"

    1. QET
      Pirate

And now I remember that most countries with a large naval force once upon a time legalized literal piracy towards nations they considered hostile.

      Except the "legal pirates" were called privateers or something like that.

      1. Gavsky

        Absolutely so, but when you're very powerful & have a Big Stick - you make the rules. Big Sticks are less popular these days, so substitute 'money'.

    2. Bebu sa Ware Silver badge
      Pirate

      "Pirates stole my pirate ship"

I recall Captain Pugwash suffered this indignity on several occasions, but I don't recall the British Navy's assistance ever being requested. In the few episodes where the British West Indies Administration and Navy appeared, they were more bumbling than Pugwash himself (and his crew).

  4. Empire of the Pussycat Silver badge

    "American-led, democratic AI."

    There's absolutely nothing 'democratic' about AI.

    1. segfault188

      Re: "American-led, democratic AI."

      There's absolutely nothing 'democratic' about AI.

      There's absolutely nothing 'democratic' about America (Trumpistan) or AI

  5. druck Silver badge
    Go

    They are stealing the stuff we stole

I thought the term frontier model was nonsense, but it's quite apt: just like the frontier of the Wild West, everything the first settlers gained was stolen from the Native Americans, and then stolen again by the next wave of gunslingers.

    Hopefully if all those billions invested in training models can be ripped off for peanuts, the AI bubble will burst even faster.

    1. Michael Strorm Silver badge

      > "They are stealing the stuff we stole"

      Yo dawg, etc.

  6. steelpillow Silver badge
    Alert

    This is such an important technology

    it cannot be left to proprietary capitalism and the fight for monopoly lock-in

    1. MonkeyJuice Silver badge

      Re: This is such an important technology

      Except the LLM part isn't. The LLM part is absolutely unimportant, it just commands a high price. Reminds me of crypto. Imagine how far we could have come if our species wasn't fundamentally stupid.

  7. cd Silver badge

    "Threat actors"...

    1. Gavsky

      Only a little better than "Supporting Role Bastards" [no lines]

  8. Camilla Smythe

    Projection

    Wub Wub Wub.

  9. Blue Screen of Bleurgh

    AI is slowly eating itself

    Rejoice!

  10. paluster
    Alert

    Being serious for a moment

    Tempting though it is to laugh when the AI children start crying about how mean people are being...a serious question about the fear mongering from the bloke at Google.

If companies, especially in the financial sector, are training models on "internal, sensitive data", why would they deploy them where a hostile distillation attack was possible? Are US companies really that stupid?

    1. Like a badger Silver badge

      Re: Being serious for a moment

Are US companies really that stupid?

      Based on the behaviour we observe there can only be one response. In their haste to seize the moment, all possible and potential risks have been overlooked. Sometimes it's because the board (and the people it actually listens to) are happily ignorant of the various possible risks, sometimes they have simply rationalised away the risks because of FOMO. I suspect most of the time it's both.

      Imagine being a competent CIO of a US company these days. You'd be AI-sceptic, willing to take certain known risks experimentally. But all of the board want the company to leap headlong into the AI future, becoming an early adopter, seizing the vast, vast rewards. If you won't do it, they'll find somebody who will. Assuming you do what's right (at the time) for your career....then if you're around when it all goes horribly, horribly wrong, whose arse will be hung out on the line?

      1. Anonymous Coward
        Anonymous Coward

        Re: Being serious for a moment

It's like they're drunk, high, experiencing shared hallucinatory states of altered consciousness that fog whatever modicum of judgment their minds might have formerly managed to exercise, if any at all. They're engaged in a collective departure from rationality fueled by cognitive biases, emotional extremes, and social dynamics, one that feeds on greed, overconfidence, herd behavior, and a huge dose of FOMO performance anxiety imho.

Nuttin' but bog standard ISO 45003 certifiable batshit craziness however, entirely treatable by a strict regimen of approved straitjackets, associated leather restraints, anxiolytics, tranquilizers, sedatives, muscle relaxants, cyclohexanone-derived general anesthetics with analgesic and hallucinogenic properties, semi-synthetic opiate analgesics (piperidine or not), and what have you, including the likes of first-class composable delimited continuations ... plenty of choices for the 'temporarily' indisposed!

        On the other hand, there's too few enforceable patents on booze so forget that, and until we can offer a subscription model for ElectroConvulsive Therapy as-a-Service (ECTaaS) that too (and lobotomy) remains out of favor. Like it or not, this Lite-Brite AI (so-called) future (even more so-called) can't be played without a full set of approved proprietary pegs (whatever that means!)! </wut?!> ;););)

        1. Bebu sa Ware Silver badge
          Coat

          "subscription model for ElectroConvulsive Therapy as-a-Service (ECTaaS)"

          Zuckerberg could reengineer his wildly successful Metaverse artificial reality headwear for ECTaaS.

Should be a big hit with the majority of the users of his social media platform, who are clearly in need of a factory reset in the head department, although a prefrontal leucotomy might be preferable; but the required robotic tech for that might be more in line with Space Karen's portfolio and social media users - a merger of xAI, Neuralink and SpaceX.

          † Uncomfortably close to John Lumic's Cybus Industries' surgical human "upgrade" robots.

  11. Long John Silver Silver badge
    Pirate

    Tears rolling down one's cheeks?

    Distress or laughter?

    1. Gavsky

      Re: Tears rolling down one's cheeks?

      Oh, trust me: it's laughter all day long...

  12. TaabuTheCat

    Parasites

    When you consider the end goal of all these parasites - Google, Microsoft, Anthropic, etc. - is to put all knowledge behind a paywall, then the "theft" being reported needs to continue full steam ahead.

Call me cynical, but I can easily see a future where there are no search engines, no open web, only "Ticketmaster/Venue" exclusive deals where anyone creating new knowledge cuts a deal with their AI partner to sell them what would have been openly available on their website. And if you want that information, you'll have to pay one of these fucking parasite middlemen to get it.

  13. vogon00

    So what?

    Don't know how anyone can act surprised about this. If stuff is published openly on the web, you can expect it to be used....which has been the situation ever since Berners-Lee's brainchild became commercialised (the same is true for the other ways info transits the internet, but you get the idea).

    What this info gets used for is the interesting bit, and depends on your POV. Use can be for good or for bad, but who decides on the goodness or badness of any use case? Back in the day, for example, Napster were exploiters to some, heroes to others and disruptors to all.

    Anything as 'hot' as AI will always have people looking to use it to get an 'edge' somehow, either for sales or to improve your own product or service somehow. I don't see how the AI companies can be surprised, as their own scraping activities have shown that any info published - irrespective of copyright etc - is fair game.

    Still, you have to either admire their brass neck or wonder how stupid they are!

  14. Gavsky

Oh dear, how sad - never mind. The AI models trained on lots of copyright material are now being probed by other AI models, allegedly stealing from them. So, sue...or, just go away.

  15. T. F. M. Reader

    Baffled

If I understand the article correctly, the "threat" of "distillation" is that I can ask Gemini or ChatGPT or Claude something - maybe many somethings - and infer enough information from the answers about the model's secret sauce[*] that it can then be used in my own LLM to make it competitive with the original. This does not tell me what the hell it is all about, really. Can someone, El Reg included, provide an illustrative example or two of what such "secret sauce" may be and how useful details can be uncovered through normal use? Alternatively, I'd appreciate an explanation of why my understanding is incorrect.

    [*] "Distilling the secret sauce" does make some sense when one thinks of a distillery, I admit.

    1. sarusa Silver badge
      Devil

      Re: Baffled

There's no specific secret sauce, just money and the model weights it produced.

Basically, the Big Evil American Corporations have spent hundreds of billions of dollars stealing zettabytes of data and training their models with it. This is a very laborious process which takes hundreds of millions of dollars, shiteloads of GPUs, shiteloads of power, running for months for a single run.

      They map out how all the tokens relate to each other in thousands-dimensional space, like 'car' appears pretty much in the same location in this space as 'automobile'. 'Kiwi' is a long way from those but is close to both 'fruit' and 'New Zealand'. But then 'Ford' is close to both 'car' and 'unreliable piece of shite' (I know, I know, that's not actually a single token). Then they compress this a lot, like saying okay, you can treat 'automobile' and 'car' as the same thing, because you can't distribute or efficiently run something that needs trillions of numbers. What comes out at the end, after months of processing and hundreds of millions of dollars, is their model, like ChatGPT 4.1.
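      The "tokens close together in space" idea can be sketched with toy vectors - real models learn thousands of dimensions during training, whereas every number below is invented purely for illustration:

      ```python
      import math

      # Toy 3-dimensional "embeddings". Real models use thousands of
      # dimensions and learn these values; these numbers are made up.
      embeddings = {
          "car":        [0.90, 0.80, 0.10],
          "automobile": [0.88, 0.82, 0.12],
          "kiwi":       [0.10, 0.20, 0.90],
      }

      def cosine(a, b):
          """Cosine similarity: near 1.0 = same direction, near 0 = unrelated."""
          dot = sum(x * y for x, y in zip(a, b))
          norm_a = math.sqrt(sum(x * x for x in a))
          norm_b = math.sqrt(sum(y * y for y in b))
          return dot / (norm_a * norm_b)

      # 'car' and 'automobile' sit almost on top of each other;
      # 'kiwi' points somewhere else entirely.
      print(cosine(embeddings["car"], embeddings["automobile"]))  # ~0.9996
      print(cosine(embeddings["car"], embeddings["kiwi"]))        # ~0.30
      ```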

Well, then DeepSeek comes along and just queries the shite out of ChatGPT from tens of thousands of separate PCs, pretending to be different actual people, but the goal is basically 'what are your final weights?'. And they were able to do this for $300K-6M (they have been deliberately cagey, because they pretended they weren't just stealing from other models, but the $6M is probably inflated to make 'we weren't stealing!' seem less likely).
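      A toy sketch of that query-and-fit idea, with a one-parameter linear "teacher" standing in for an LLM and plain least squares standing in for distillation training - everything here (the secret weights, the query counts) is invented for illustration, not how DeepSeek actually did it:

      ```python
      import random

      # A stand-in "teacher" model with secret weights. In reality this
      # would be a remote LLM API with billions of parameters, not two.
      SECRET_W, SECRET_B = 3.7, -1.2

      def teacher(x):
          return SECRET_W * x + SECRET_B

      # The "attacker" never sees the weights, only query/response pairs.
      random.seed(0)
      queries = [random.uniform(-10, 10) for _ in range(1000)]
      answers = [teacher(q) for q in queries]

      # Fit a student model to the teacher's answers (least squares).
      n = len(queries)
      mean_x = sum(queries) / n
      mean_y = sum(answers) / n
      student_w = (sum((x - mean_x) * (y - mean_y) for x, y in zip(queries, answers))
                   / sum((x - mean_x) ** 2 for x in queries))
      student_b = mean_y - student_w * mean_x

      # Given enough queries, the student converges on the teacher's secrets.
      print(student_w, student_b)
      ```

      With a real LLM the attacker fits billions of student parameters to the teacher's text outputs rather than solving two numbers exactly, but the economics are the same: querying is vastly cheaper than the original training run.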

      Honestly, I'm fine with this. 'Oh noes, you stole all the stuff I stole from the peasants, you [racial slur] miscreants!' Well Boo Effing Hoo. You wouldn't have DeepSeek if the US companies hadn't spent billions of dollars stealing that stuff first. But then it's absolutely fine for someone else to liberate that and distribute it.

  16. Bebu sa Ware Silver badge
    Coat

    "American-led, democratic AI."

    "Him funny." Vintage Alltman I suspect. "Loose lips sink mips."

If it is cheaper to interrogate competitors' models than to train your own, then perhaps those competitors aren't charging an economic fee for the service and are effectively "dumping" (in the WTO/GATT sense) their AI into the market to discourage new entrants. (Anti·competitive measure.)

    Democratic is an odd word to apply to AI — presumably meant popular or accessible.

Anyway it would be "republican AI" surely? And American .45 lead?

  17. Ian Johnston Silver badge

    "Reasoning ability" my arse.

  18. Blackjack Silver badge
    Happy

    Say the companies that copy everything online to feed their own AI models.

  19. DS999 Silver badge
    Facepalm

    So AI people think stealing is fine

    So long as it is stealing copyrighted data in Library of Congress sized units to train their models. But if anyone tries to pick and poke at their models to figure out how they work, that's criminal behavior and Something Needs To Be Done, like immediately!

    1. Ropewash Silver badge

      Re: So AI people think stealing is fine

      Yep. It was all fun and games taking the artists and authors out behind the woodshed for profit, but reverse engineering their mechanical turk is blasphemy.

  20. Legb

    Garbage in garbage out comes to mind!

If AI is artificial intelligence, when two artificial brains combine, will more people believe in the resulting chaos?

  21. mark l 2 Silver badge

    It's going to take an "ecosystem security" approach to protect against distillation, and this will require some US government assistance, OpenAI says. "It is not enough for any one lab to harden its protection because adversaries will simply default to the least protected provider," according to the memo.

I'm not sure exactly what OpenAI and Google think the government can do to stop other companies probing their models if they can't stop them at a technical level. Or are they seeking more threats of tariffs from Trump against China if the Chinese government doesn't stop DeepSeek and others from stealing the content that American companies worked so hard to steal for themselves?
