Road to nowhere: UK plans for an 'AI assurance industry' but destination is unclear

The UK government's Centre for Data Ethics and Innovation (CDEI) has published a "roadmap" designed to create an AI assurance industry to support the introduction of automated analysis, decision making, and processes. The move is one of several government initiatives planned to help shape local development and use of AI – an …

  1. amanfromMars 1 Silver badge

    Harsh maybe ..... but perfectly true and undeniable

    Democratically, publicly elected government, by its very nature [chosen by any Tom, Dick, Harry or Jane of voting age who can be bothered to pick from a very small, mixed and motley crew of wannabe cherished members of a handful of predominantly personally and/or privately funded political parties], does not have the necessary brainpower to exert any influence at all on the inexorable rise of the virtual AI Machine.

    And to even imagine, let alone think, that they can or do is evidence of the delusional state in which they exist, and with which they would seek to further infest and infect the future.

    And all other well-known and well-practised forms of government suffer from the same deficit of knowledge and experience.

  2. Version 1.0 Silver badge
    Joke

    "but destination is unclear"

    No, the destination is clear: they are about to make it illegal to protest anything that the government does, so AI will be used to prevent anyone treating a Prime Minister as a dictator. This is just going to generate AI assurance industry funding and ensure a dictator (AI correction and automatic icon addition) a wonderful oven-ready government re-election.

  3. Eclectic Man Silver badge

    AI and Ethics

    It is all too easy to criticise attempts to put ethics at the heart of the AI debate, but it is also essential that ethics be a central condition. Many articles, here on the Register and elsewhere, have highlighted injustices caused by AI and 'Computer says "no"' decisions that could not be challenged because there was no legal way to examine the reasoning.

    Examples I recall include:

    Police use of crime-prediction software, so that when a neighbourhood experiences a crime it gets more police patrols, which in turn find more crime, in a self-fulfilling prophecy (see the sketch after this list).

    Denial of bail because a person lives in a 'high crime' area and is therefore deemed more likely to offend if released, so they are kept in custody, discriminating against people from black neighbourhoods.

    Training of AI systems, facial recognition systems especially, on datasets obtained almost exclusively from white, middle-class males, without considering that a Hispanic or black female may therefore not be treated fairly.
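
    A minimal toy sketch of the first example's feedback loop (assuming, purely for illustration, a greedy dispatch policy and made-up numbers; none of this is from the original comment): two neighbourhoods have exactly the same underlying crime rate, yet whichever one records a crime first attracts every subsequent patrol and therefore accumulates all of the recorded crime.

        import random

        random.seed(0)

        # Both neighbourhoods share the SAME real crime rate; the rate and
        # the greedy dispatch policy below are illustrative assumptions.
        TRUE_RATE = 0.1
        recorded = [0, 0]   # crimes recorded so far, per neighbourhood
        patrols = [0, 0]    # patrols dispatched so far, per neighbourhood

        for day in range(2000):
            # Send today's patrol wherever the data says crime is "worse";
            # break the initial tie at random.
            if recorded[0] == recorded[1]:
                target = random.randrange(2)
            else:
                target = 0 if recorded[0] > recorded[1] else 1
            patrols[target] += 1
            # A patrol can only record crime where it is actually looking.
            if random.random() < TRUE_RATE:
                recorded[target] += 1

        # After the first recorded crime, every patrol goes to that
        # neighbourhood, so its record grows while the other's flatlines.
        print("patrols dispatched:", patrols)
        print("crimes recorded:", recorded)

    Despite identical true rates, the printed counts come out entirely one-sided, which is exactly the self-fulfilling prophecy described above.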

    Whatever the ethical criteria turn out to be, they really must be robust; but, coming from a government that has indicated its desire to repeal the UK's Human Rights Act and perhaps to leave the jurisdiction of the European Court of Human Rights, it does seem that we need to keep a close watch on them.

  4. elsergiovolador Silver badge

    Classic

    If it moves, tax it. If it keeps moving, regulate it. And if it stops moving, subsidize it.
