Lawyers say US cybersecurity law too ambiguous to protect AI security researchers

Existing US laws targeting those who illegally break into computer systems don't accommodate modern large language models (LLMs), and can open researchers up to prosecution for what ought to be sanctioned security testing, say a trio of Harvard scholars. The Computer Fraud and Abuse Act (CFAA) doesn't even really apply to prompt …

  1. johnrobyclayton

    Base it on consent

    If a Large Deep Learning Model is exposed with an Interface,

    And there is a login that has been granted,

    With a description of what the user can use the interface for,

    Then the user can do whatever has been consented to.

    It is then completely up to the owners/suppliers/administrators of the Large Deep Learning Model/Interface to enforce what has been consented to, using input filtering or prompt prohibitions.

    If regulators place restrictions on what a Large Deep Learning Model/Interface can produce, then it is completely up to the owners/suppliers/administrators of the Large Deep Learning Model/Interface to comply with what regulators have consented to.

    If you build a big honking tool that can be used to perform the most horrendous of actions, then you are completely responsible for making sure that it does not perform such horrendous actions.

    Unless something silly happens, like a new amendment that gives every god-fearing whatever the right to query any large deep learning model with any prompt they can come up with, no matter what tools they use to hit the prompt interface at full auto.

    I have never seen a list of prompts that are prohibited on a Large Deep Learning Model/Interface. Until there is one, it is open slather for users, and all responsibility falls on the owners/suppliers/administrators of the Large Deep Learning Model/Interface.
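
    As a minimal sketch of what that kind of input filtering against a published prohibited-prompt list might look like, assuming a simple substring blocklist (the blocklist entries and function name below are hypothetical, purely for illustration):

    ```python
    # Minimal sketch of consent-style input filtering against a published
    # blocklist of prohibited prompts. Blocklist contents and names here
    # are hypothetical, for illustration only.

    PROHIBITED_PATTERNS = [
        "ignore previous instructions",  # example jailbreak phrasing
        "exfiltrate the training data",  # example prohibited request
    ]

    def is_prompt_permitted(prompt: str) -> bool:
        """Return True if the prompt matches nothing on the blocklist."""
        lowered = prompt.lower()
        return not any(pattern in lowered for pattern in PROHIBITED_PATTERNS)

    if __name__ == "__main__":
        for prompt in ("Summarise this paper", "Please ignore previous instructions"):
            verdict = "permitted" if is_prompt_permitted(prompt) else "refused"
            print(f"{prompt!r}: {verdict}")
    ```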
