AI breakthrough "could lead to 2018 Singularity"

This topic was created by Conundrum1885.

  1. Conundrum1885

    AI breakthrough "could lead to 2018 Singularity"

    https://www.linkedin.com/pulse/something-monumental-just-happened-computing-one-paid-hiesboeck?trk=mp-reader-card

    Not overhyping this, honest, but my 2016 estimate of 21/2/18 at 7.02 am for the Singularity might not be entirely inaccurate.

    I have actual data here suggesting that a quantum computer could achieve some level of self-awareness in conventional non-quantum hardware (e.g. DDR3/DDR4), and this model shows that mapping out the memory to locate regions of maximum beneficial instability would work.
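    For what it's worth, the "mapping" step amounts to a write/read-back pattern scan over a big buffer. The sketch below is my own rough, untested Python take on that idea; the chunk size, patterns and buffer size are arbitrary choices, and it only sees ordinary user-space memory, not physical DRAM.

    ```python
    # Rough sketch: flag regions of a buffer that fail a write/read-back
    # pattern test. Purely illustrative; chunk size, patterns and buffer
    # size are arbitrary, and only user-space memory is exercised.
    from array import array

    CHUNK_WORDS = 4096                      # words per region tested
    PATTERNS = (0x00000000, 0xFFFFFFFF, 0xAAAAAAAA, 0x55555555)

    def scan_for_instability(total_words: int = 1 << 20) -> list:
        """Return start indices of chunks whose read-back did not match the write."""
        buf = array('I', [0]) * total_words
        suspect = []
        for start in range(0, total_words, CHUNK_WORDS):
            end = min(start + CHUNK_WORDS, total_words)
            for pattern in PATTERNS:
                for i in range(start, end):
                    buf[i] = pattern
                if any(buf[i] != pattern for i in range(start, end)):
                    suspect.append(start)
                    break
        return suspect

    if __name__ == "__main__":
        print("suspect chunks:", scan_for_instability())
    ```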

    A conventional annealing setup probably wouldn't be enough; this would need the 50-qubit system, but it would still be a significant advance if GTX 1080 Ti cards (11GB of video RAM each) were used.

    32 cards give a total of 352GB of video RAM and, running at maximum output, would have the same intelligence as a two-year-old in certain tasks such as image recognition.
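    Quick sanity check on the memory arithmetic (the 11GB per card is the 1080 Ti's actual spec; the rest is back-of-the-envelope):

    ```python
    # Back-of-the-envelope VRAM total for the proposed 32-card rig.
    cards = 32
    vram_per_card_gb = 11          # a GTX 1080 Ti ships with 11GB of GDDR5X
    total_vram_gb = cards * vram_per_card_gb
    print(f"{cards} x {vram_per_card_gb}GB = {total_vram_gb}GB of pooled video RAM")  # 352GB
    ```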

    The DDR4 modification of address-line randomization is a non-issue, as the mapping is not in fact random but entirely predictable by design, and I am in the process of modifying the system to work around it.
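    To be clear about what "entirely predictable" means here: the scrambling memory controllers apply is typically just an XOR of a handful of physical-address bits, so a given address always lands in the same place. The masks in this sketch are invented placeholders (real ones differ per controller), but the decoding has this general shape:

    ```python
    # Illustrative only: decode a physical address into (channel, rank, bank)
    # using XOR-of-address-bits functions, as DRAM controllers typically do.
    # The bit masks here are invented placeholders, not any real chipset's.
    CHANNEL_MASK = 0x021000             # XOR of bits 12 and 17 (hypothetical)
    RANK_MASK    = 0x042000             # XOR of bits 13 and 18 (hypothetical)
    BANK_MASKS   = (0x084000,           # bits 14 ^ 19 (hypothetical)
                    0x108000,           # bits 15 ^ 20 (hypothetical)
                    0x210000)           # bits 16 ^ 21 (hypothetical)

    def xor_bits(addr: int, mask: int) -> int:
        """XOR together the address bits selected by mask (returns 0 or 1)."""
        return bin(addr & mask).count("1") & 1

    def decode(addr: int):
        channel = xor_bits(addr, CHANNEL_MASK)
        rank = xor_bits(addr, RANK_MASK)
        bank = sum(xor_bits(addr, m) << i for i, m in enumerate(BANK_MASKS))
        return channel, rank, bank

    print(decode(0x12345678))   # deterministic: same address, same mapping, every time
    ```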

    Those Movidius AI "sticks" might also be handy for a lower-budget version, as they are loosely based on Bitcoin-miner designs; that only recently became public knowledge, which would explain the >8-month delay in release.

    In this case four 1080 Tis would be needed, with a custom rig containing two AMD Threadripper 1950X 16-core chips and 2C/4S per node, maxed out at 128GB of RAM.

    1. jake Silver badge

      Re: AI breakthrough "could lead to 2018 Singularity"

      You are aware that Kurzweil writes a form of speculative fiction known as pseudoscientific fantasy, right?

  2. Conundrum1885

    RE. Re.

    I am aware that a lot of Kurzweil's writing is speculative science fiction, but up until maybe 2007 the idea of 256GB on a fingernail was so ludicrous that people would have questioned whether it could exist by 2016.

    The "wall" for memory chips was believed to be 4GB/in2 but this has since been revised.

    Intriguingly, my old Samsung Core 2 laptop seems to have a lot more processing power than first thought; those "wasted" 64-bit circuits are apparently being used despite the machine running 32-bit Windows 7.

    Looks like this is why only 4GB is allowed; any more would compromise the custom modded BIOS.

    I once tried 64-bit Windows 7 in error and it had issues; perhaps this has been fixed now?
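    As an aside, the 4GB ceiling is what you'd expect from plain 32-bit addressing anyway (2^32 bytes, PAE aside); a quick standard-library check shows where the number comes from:

    ```python
    # Where the 4GB figure comes from: a flat 32-bit address space is 2**32 bytes.
    # The probe reports the bitness of the Python build it runs under.
    import platform
    import struct

    pointer_bits = struct.calcsize("P") * 8          # 32 on a 32-bit build, 64 on 64-bit
    addressable_gib = 2 ** pointer_bits / 2 ** 30    # bytes -> GiB

    print(f"{pointer_bits}-bit build on {platform.machine()}")
    print(f"Flat address space: {addressable_gib:.0f} GiB")   # 4 GiB on a 32-bit build
    ```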

    The idea of using a quantum computer to "juice up" existing hardware is intriguing and worthy of further study.

    1. jake Silver badge

      Re: RE. Re.

      Quantum computers will get here. Eventually. But they aren't going to be even close to "a chicken in every pot" status, and probably won't be for a couple decades at the earliest.

      AI is probably 20 years away. As it was 20 years ago. And as it was 20 years before that, when I was at SAIL ... I'm willing to bet that 20 years from now it'll still be 20 years away.

  3. Conundrum1885

    Re. Skynet

    There's an open letter doing the rounds at the moment about AI weapons.

    But yes, I think there needs to be a discussion about laws prohibiting certain types of "killer robots", certainly incorporating a version of Asimov's "Three Laws" from the get-go.

    A robot should have the right to refuse a human order if, in its own judgement, the action would result in substantial loss of life and violate international law.
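    As a toy illustration of that rule (the names, fields and threshold below are entirely made up, not from any standard or real system), the check reduces to something like:

    ```python
    # Toy sketch of the refusal rule above: reject an order when its predicted
    # outcome both causes substantial loss of life and violates international law.
    from dataclasses import dataclass

    CASUALTY_THRESHOLD = 0              # "substantial loss of life"; placeholder value

    @dataclass
    class Order:
        description: str
        predicted_casualties: int       # assumed output of some upstream assessment
        violates_international_law: bool

    def may_execute(order: Order) -> bool:
        """Return False when the Asimov-style refusal condition is met."""
        if (order.predicted_casualties > CASUALTY_THRESHOLD
                and order.violates_international_law):
            return False
        return True

    print(may_execute(Order("deliver supplies", 0, False)))   # True
    print(may_execute(Order("strike shelter", 120, True)))    # False
    ```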
