Boffins whip up SELF-WIRING chip

Boffins have developed a new nanoscale material that could potentially allow computers to rewire themselves according to the user's needs. Scientists at Northwestern University decided to look at the problem of teeny-tiny circuits in ever-shrinking electronic devices in a new way, by coming up with a material that can be a …

COMMENTS

This topic is closed for new posts.
  1. D. M
    Terminator

    Cool, and here is an idea

    Apple's new meaning of a locked-down ecosystem - "you cannot rewire your chip".

  2. Naughtyhorse

    cool science

    but if you _have_ to refer to someone as 'top brain', then shouldn't it be spelled 'top brane'?

    not to do so is a bit of a chiz if you ask me

  3. Yag
    Terminator

    mmmh...

    I, for one, welcome our new self-evolving silicon overlords.

    1. ledmil
      Terminator

      Re: mmmh

      Noooooooo...... Beat me to it :)

  4. defiler

    Bizarre

    So they're using electricity to reconfigure a homogeneous slab of material, and then routing electricity through it without disturbing the "paths" created by the first lot of electricity.

    Nope - that's beyond me. Pretty clever though. But can it be milked?

    1. Anonymous Coward
      Anonymous Coward

      Re: Bizarre

      Beyond me too; I think Ian Dury expresses this best:

      "There Ain't Half Been Some Clever Bastards"

  5. Will Godfrey Silver badge
    Alien

    I, for one...

    welcome our evolving, self-modifying overlords!

  6. aldude
    Unhappy

    New?

    Erm.... FPGA?

    1. diodesign Silver badge
      Happy

      I appreciate your point

      But I gather this technology will be more flexible than configuring FPGA blocks - quite possibly smaller too. Plus, it's such a neat trick...

    2. Jason Bloomberg Silver badge

      Not new, but evolutionary

      Or perhaps a leap forward.

      FPGAs are mainly digital in nature, and this seems more useful for analogue signal processing. We can already switch signal paths between different types of circuit, but this looks to be a technology which actually shapes the circuit into what is wanted.

      The main advantage would seem to be that fewer components are needed: each component can become any type of component, rather than the chip needing one of each and deciding which to use. Thus more usable components can be fitted on a chip.

      1. annodomini2
        Thumb Up

        And as a result, hopefully more power-efficient than FPGAs

  7. Antoine Dubuc
    Terminator

    what could possibly go wrong?

    There go the Three Laws of Robotics.

    1. Will Godfrey Silver badge
      Happy

      I think it was Carl Sagan who once said that super-intelligent computers would probably want to keep us as pets - something he'd be quite happy about, as pets are generally treated much better than people.

      1. Rob
        Go

        As long as...

        ... these machine overlords set up the RSPCH, 'cause there's always a few who will abuse their pets.

  8. Jim Carter
    Terminator

    Probably a bit A Man from Mars-ish...

    But this could be a significant step on the road to the artificial intelligence singularity, given that organic brains have the capability to re-wire themselves to form new cognitive connections.

  9. launcap Silver badge
    Devil

    I Have No Mouth, and I Must Scream

    Super intelligent computers who keep us as pets?

    See the story referenced above. I remember reading it as an impressionable youth, and the impression it left is probably why I went into computing...

    1. Daniel B.
      Terminator

      You wanted to build A.M.? Yipes. Just don't hook it up to the nukes...

  10. Gideon 1

    New?

    Erm - Field Effect Transistor?

  11. JimmyPage
    Boffin

    The holy grail ?

    ISTR from 25 years ago that analogue computers were very fast and good at some things, particularly graphics - getting an analogue circuit to draw a circle, or thousands of circles, on screen is blazingly fast compared to digital - control engineering being one example we used. However, they were tricky and cumbersome to programme, as you needed to rewire them every time.
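
    For the curious: the classic analogue trick for a circle is two integrators in a feedback loop solving x'' = -x, with the (x, y) outputs tracing the circle directly. A quick digital simulation of that loop - just a sketch in C, nothing from the article - shows the idea:

      #include <stdio.h>

      /* Simulate the two-integrator analogue loop for x'' = -x:
         integrate y into x and -x into y, and (x, y) traces a circle. */
      int main(void) {
          double x = 1.0, y = 0.0, dt = 0.01;
          for (int i = 0; i < 629; i++) {       /* ~one revolution: 2*pi/dt */
              x += y * dt;                      /* first integrator  */
              y -= x * dt;                      /* second integrator */
              if (i % 157 == 0)
                  printf("%+.3f %+.3f\n", x, y);  /* quarter-circle samples */
          }
          return 0;
      }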

    So this news might lead to digital computers that can design and implement analogue components internally, to make applications both fast and accurate. Maybe we are moving closer to a robotic world.

  12. TwistUrCapBack
    Terminator

    Is it just me ???

    That feels a bit uneasy about where this might lead ??

    20 years from now the lead researcher (brain) is going to get a visit from a large man who peels back the skin on his arm and explains why he "shouldna dun that" ..

  13. Anonymous Coward
    Anonymous Coward

    RoTM?

    This has got RoTM written all over it. The reason why it was not tagged as such from the beginning makes me shudder in fear.

    OK, did anyone *notice* the metallic vulture in Transformers: Dark of the Moon? The one that disguises itself as a desktop machine?

    I'll leave it to you now...

  14. The last doughnut
    Unhappy

    Oh dear

    Someone has actually quoted a mindless action film in the comment section. Time to delete my Register bookmark.

  15. Anonymous Coward
    Thumb Up

    hmmm

    I got told off by my GCSE computing teacher for writing some self-modifying Z80 assembly code. Now you're telling me I can have self-modifying hardware to run my self-modifying code on too?! Cool!
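
    For anyone who never got the chance to annoy their teacher: here's roughly what the same trick looks like today in C - a sketch only, assuming x86-64 Linux and a system that still allows writable-and-executable pages - patching a function's machine code while the program runs:

      #include <stdio.h>
      #include <string.h>
      #include <sys/mman.h>

      /* Self-modifying code, the modern way: build a tiny function in a
         writable+executable page, call it, patch its bytes, call it again. */
      int main(void) {
          unsigned char code[] = { 0xB8, 0x2A, 0x00, 0x00, 0x00,   /* mov eax, 42 */
                                   0xC3 };                         /* ret         */

          void *page = mmap(NULL, 4096, PROT_READ | PROT_WRITE | PROT_EXEC,
                            MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
          if (page == MAP_FAILED) return 1;

          memcpy(page, code, sizeof code);
          int (*fn)(void) = (int (*)(void))page;
          printf("%d\n", fn());               /* prints 42 */

          ((unsigned char *)page)[1] = 0x07;  /* patch immediate: mov eax, 7 */
          printf("%d\n", fn());               /* prints 7 */

          munmap(page, 4096);
          return 0;
      }

    (Strict W^X systems will, quite sensibly, refuse the mapping - so this one's for educational annoyance value only.)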

  16. Stephen W Harris
    Mushroom

    More viruses

    Current viruses may attack the CPU by loading a new microcode layer, but this is cleared on reset. I guess the next generation of viruses will physically rewire your CPU for you! Why am I envisioning some Neuromancer scenario where ICE wipes your CPU block clean?

  17. Nick Sargeant

    The grey goo cometh ..

    Of course one possible application for this is for tamper-proof, unhackable hardware that can destroy itself if it falls into the wrong hands .. or if you forget to pay the monthly licence fee to continue to use <appliance> for the purpose you thought you'd bought it outright for ..

  18. nyelvmark
    Thumb Up

    This is very exciting.

    I can think of many applications of programmable analogue circuitry which could make a lot of money. If you're looking for a long-term investment, I would look closely at this.

  19. Toastan Buttar

    Xzibit

    'Sup dawg. I herd you like wirin' yo computer chips, so I put a rewirable chip in yo computer so you can rewire your computer while computin' yo rewires.

  20. Robert E A Harvey

    Old friends

    Wouldn't Ivor Catt have loved this?

  21. Anonymous Coward
    Anonymous Coward

    fun speculating

    ISTR there was this compiler that generated self-modifying code. The ratio of modified code to modifying overhead was about one to ten or something to that tune. Wonder what this'd do.

    Maybe this will turn out to be a great RoTM-enabler, and maybe not. We already have "neuron-like" constructs on chips, but not nearly enough of them to build a brain-sized constellation from.

    Then again, our overlords (usually technically human, or at least persons, say corporations and quangos and such) are already vying to take control from the people. So it's probably more of an "if we let them" issue. You know: privacy, data security, governance, building safeguards right into policies, providing redress, and not accepting crap from our rulers, ever, whatever they might be made of. That last detail is largely irrelevant, so maybe we oughtn't overlook the rest.

    1. nyelvmark
      Boffin

      Self-modifying code.

      So, if

        x = rand();

      isn't good enough, you can try:

        srand(rand()); x = rand();

      Unfortunately, computers continue to be deterministic. If you'd like a computer program with non-deterministic output, I'd be happy to write one for you (I will need a non-deterministic input, but that's easy enough to come by: simply measure the time between user inputs).
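
      Something along these lines would do - a sketch only, assuming a POSIX clock, with the multiplier just a crude way of mixing the timings together:

        #include <stdio.h>
        #include <stdlib.h>
        #include <time.h>

        /* Seed the PRNG from the timing of the user's keystrokes: the
           nanosecond counts at each press are non-deterministic input. */
        int main(void) {
            unsigned seed = 0;
            printf("Press Enter three times, with irregular pauses:\n");
            for (int i = 0; i < 3; i++) {
                struct timespec t;
                int c;
                while ((c = getchar()) != '\n' && c != EOF)
                    ;                           /* wait for Enter */
                clock_gettime(CLOCK_MONOTONIC, &t);
                seed = seed * 2654435761u + (unsigned)t.tv_nsec;
            }
            srand(seed);
            printf("x = %d\n", rand());
            return 0;
        }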

      "Self-modiying code'" is as valid a technique as recursive code. Neither are widely used, because they reduce readability (and thus make debugging tough), and performance gains are minimal in most use-cases.

      HTH.

This topic is closed for new posts.