AI needs a regulatory ecosystem like your car, not a Czar

The world will need an ecosystem of interlocking AI regulators, pundits argued at the ATxSG conference in Singapore today. Across several sessions, the advent of generative AI was hailed by many at the conference as a defining moment in human history, on par with the widespread use of fossil fuels or the industrial revolution. The …

  1. that one in the corner Silver badge

    If I hammer in a nail with a lathe...

    I don't get to blame the lathe manufacturer when I drop it on my foot.

    > Almost every panel The Reg attended involved experts having a chuckle at the lawyer's folly....

    So almost every panel agrees with my sentiment and yet

    > "It's even more important when non-experts are using these tools that there is accountability, non-liability. If you aren't an expert, where do you go to check the tool? The onus should be on those creating it."

    These - things - are toys and, as a professional, you have a duty to know that (your professional body sure as hell has a duty to you to make that clear, preferably in the rules of conduct) and be prepared to accept liability.

    What those creating the LLMs should be made to do is state clearly, every time, "this is for entertainment purposes only" - preferably within the generated text, multiple times! At least make the pillocks misusing it have to do some deliberate, knowing editing, so there can be no claims of "I didn't know".

    1. jmch Silver badge

      Re: If I hammer in a nail with a lathe...

      "What those creating the LLMs should be made to do is state clearly, evey time,"this is for entertainment purposes only" "

      Everyone developing these LLMs is doing it to sell access to them commercially, and very few people would pay for an LLM to be entertained. They want to position the product as all-knowing to their clients, and only pretend to make their customers aware of the flaws. Tesla's "Full Self-Driving" springs to mind. They have to be forced to get the caveats out of the small print and bring them front and centre.

      1. nobody who matters

        Re: If I hammer in a nail with a lathe...

        ".......What those creating the LLMs should be made to do is state clearly, evey time,......." this is NOT Artificial Intelligence.

        There, fixed that for you.

        I have my doubts whether actual AI will ever arrive. If these people keep insisting on referring to things like LLMs as AI, and trying to pretend they can do things which they are clearly not capable of doing (in other words, effectively trying to run before they can walk), it would not surprise me in the least if the Human Race destroys itself with one of these half-baked entities long before we get anywhere near achieving a machine/computer/programme which can actually think for itself and is self-aware (conditions it would surely have to satisfy to be considered 'Artificial Intelligence').

        1. Nick Ryan Silver badge

          Re: If I hammer in a nail with a lathe...

          I am sure that AI will arrive sometime... but what we have right now is a tidal wave of marketing bullshit around machine learning, abusing the AI/Artificial Intelligence terms until they have no meaning whatsoever. The majority of the "AI" implementations are nothing more clever than a developer putting in a couple of extra IF statements. Intelligence requires understanding and context, and this is utterly missing in all of these marketing-AI implementations.

          While ChatGPT and similar are very clever uses of technology, there is absolutely no intelligence within them. This leaves them as AS - Artificial Something... or Machine Learning and Machine Output, which isn't nearly as snappy. There is a session state which helps refine the output over the course of a session, which is a big step up from simpler/previous systems.

          In the end though, the ancient maxim still applies... GiGo - Garbage In, Garbage Out. If the source data is crap then there's a high likelihood of the output also being crap.

    2. jdiebdhidbsusbvwbsidnsoskebid Silver badge

      Re: If I hammer in a nail with a lathe...

      Agreed. Current AI systems are just tools. Yes, the manufacturers of those tools should be legislated to ensure that their tools are safe to use, but the onus is on the user to make sure they are used safely and properly for the job in hand.

      A bit like cars, where regulation makes sure they are safe when used properly, but can't prevent people using them in unsafe ways. I prefer the analogy of hand tools: AI is a tool like a power drill. The supplier has to ensure that in normal use the tool is safe and won't explode in your face, but the supplier can't be responsible if the user has poor work practices and builds a poorly constructed house with the tool. The example of lawyers using AI and not checking the output is an example of this poor work practice.

      Maybe what we need is training to understand what the safe uses of AI are, like knowing that a power drill is just that and not anything else.

      1. Paul Crawford Silver badge

        Re: If I hammer in a nail with a lathe...

        > the onus is on the user to make sure they are used safely and properly for the job in hand

        And how is a member of the public (in the non-IT expert sense) going to know of, or understand, the limitations?

        Most other products are made to, and sold as, specific standards that folks can reasonably expect to take for granted. Do you go to buy a car and have to learn the limits of when the brake pedal will or will not stop the thing?

  2. codejunky Silver badge

    Erm

    Isn't this where the market solves the problem? 'Look at my all-singing, all-dancing widget', which some people use and realise is or isn't useful. And people try it out in various situations, which can lead to uses not thought of by the creator. Or we can get a bunch of baby-kissers to try to make rules on something they won't understand.

    1. Anonymous Coward
      Anonymous Coward

      Re: Erm

      > Isn't this where the market solves the problem?

      Hmm. Nope.

  3. Dan 55 Silver badge

    No single entity tells you to stay in your lane

    Apart from the Department of Transport.

    > he suggested the tech could be regulated by professional standards, social norms that define boundaries, and education – reminiscent of a "Big Society" approach.

    I guess he's arguing for that approach because it's been comprehensively proven to have failed.

    1. Nick Ryan Silver badge

      Some of the arguments feel similar to the concept of trickle down economics...

  4. Adrian 4

    Artificial

    Artificial, in the same sense as artificial flavouring or artificial grass, is what it is. Something that has the appearance of an object, but isn't it. However the term raises higher expectations than that because it's been a target for so long. The current crop isn't what we hoped for, even if it can sneakily be described like that.

    A better name for the current effort would probably be Fake Intelligence.

    What I think we're really looking for is Machine Intelligence. Actual intelligence that's good for something, but done by machines.
