White hat hacker AI bots prepare for DARPA's DEF CON cyber brawl

The research wing of the US military has picked the seven teams who will compete to build machine-learning software that can find and patch bugs automatically to fend off hackers. The DARPA Cyber Grand Challenge will be held at the DEF CON hacking conference next month. The agency has put up $2m in prize money in the unlikely …

  1. frank ly

    If

    "The cyber reasoning systems will also be networked so they can examine their competitors' software for flaws ... "

    Imagine if they were let loose on the internet.

    "Walker said DARPA was encouraging hackers to use the source for their own use."

    Aha!

  2. Destroy All Monsters Silver badge
    Windows

    "cyber reasoning system"

    In my time, these were called "expert systems".

    What cyber marketing guru cyber invents these cyber word monstrosities?

  3. TeeCee Gold badge
    Alert

    ...to find and address exploitable flaws hidden in DARPA-supplied code.

    Are they looking at the code itself? If so: DANGER, WILL ROBINSON!

    That way lies code security testing that only ever finds previously known bug types, or, as I like to think of it, a world of zero-day heaven.

    1. Destroy All Monsters Silver badge
      Headmaster

      I didn't know that bugs come in "types".

      You had BETTER look at the code itself. It's damned easier to find a problem if you have the code than to find the same problem if you don't. "Hacks" don't appear from nowhere (unless it's 1,000-monkey hacking). The attacker will get the spec, then think about a) how the spec could fail, or b) how the developer could fail in implementing the spec. Attacks via b) are best reduced by formal methods or code inspection.
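
      A minimal sketch of a b)-type failure, using an illustrative fragment of my own rather than anything from the DARPA corpus: the spec says "copy the caller's name, at most 15 characters", the implementation skips the bound, and only inspecting the code tells you which of the two you actually shipped.

      #include <string.h>

      struct user {
          char name[16];     /* spec: at most 15 characters plus terminator */
          int  is_admin;     /* sits immediately after the buffer */
      };

      void set_name(struct user *u, const char *name)
      {
          /* BUG: unbounded copy; a long name overruns u->name and can
           * overwrite u->is_admin. The spec is fine, the implementation
           * is not: a b)-type failure that code inspection catches. */
          strcpy(u->name, name);

          /* Spec-compliant fix:
           *   strncpy(u->name, name, sizeof u->name - 1);
           *   u->name[sizeof u->name - 1] = '\0';
           */
      }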

      1. Richard 12 Silver badge

        Makes no real difference to the machine

        After all, bytecode and machine code are no different from the source as far as the executable code paths are concerned.

        The comments and variable names are extremely useful to humans, but not so much to machines, unless someone is attempting natural-language processing of the comments and variable names to infer the intended behaviour and spot places where the code doesn't match the comments.

        They only need the source code because they are supposed to create patches, and humans find it much easier to examine source code for correctness.
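
        To illustrate with a made-up fragment (mine, not anything from the contest): strip the comments and the names out of the example above and the executable code path is identical; an automated analyser still sees "unbounded copy into a 16-byte object", it just can't tell you the object was meant to hold a name.

        #include <string.h>

        struct s { char a[16]; int b; };

        void f(struct s *p, const char *q)
        {
            strcpy(p->a, q);   /* same flaw, no human-friendly labels left */
        }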

        1. Mainway
          Devil

          Re: Makes no real difference to the machine

          Makes no real difference to the human element either. Thanks largely to the security apparatus wanting to hack your PC, their efforts have weakened security across the board. I don't need Heartbleed, DROWN, BEAST or FREAK; I just need a copy of the CAs from Firefox and to be piggybacking on your connection when you log in to your local online bank, and the password, along with most of the online session, can be hijacked very simply regardless of encryption. Golden master keys? They can keep 'em: a small cluster of 50.100 GHz ought to be more than enough to break most types of encryption relatively easily, more so with de-randomising, decryption and a word-list filled with rainbow tables!

  4. Mainway

    Re: Entertained...

    Yes, incredibly!

    To quote other, more experienced developers:

    "GNU is, and always has been, a political movement that writes software as a means to an end!"

    Dynamic linking is an infinite source of complexity, security leaks, incompatibility, unreliability and so on, and yet many perceive it as "good" or even "necessary".

    The usual claim: dynamic linking lets you fix bugs in libraries / update libraries in one place.

    It also lets you introduce new bugs whose cause might be hard to find.

    Programs are not self-contained, which complicates debugging and deployment.

    Versioned symbols mean you can't actually fix bugs in one place.

    Most programs don't benefit from library updates anyway.

    The other claim: dynamic linking is secure.

    Few have offered a viable model of how dynamic linking is supposed to be secure, while many exploits are possible precisely because of dynamic linking; just look at your favourite exploit site.
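
    One hedged illustration of the kind of thing meant here, standard LD_PRELOAD symbol interposition rather than any specific published exploit: because symbols are resolved at run time, a preloaded library can sit in front of libc and quietly change what a program's calls do.

    /* interpose.c - illustrative sketch only. Build and run with:
     *   gcc -shared -fPIC -o libinterpose.so interpose.c -ldl
     *   LD_PRELOAD=./libinterpose.so ./victim
     * (The dynamic linker ignores LD_PRELOAD for setuid binaries, but
     * the general point stands: what a call does is decided at load
     * time, not when the program was built.) */
    #define _GNU_SOURCE
    #include <stdio.h>
    #include <dlfcn.h>

    /* Interpose strcmp(): the dynamic linker resolves the symbol here
     * before libc ever sees it, so a naive password check that trusts
     * strcmp() can be made to "succeed" unconditionally. */
    int strcmp(const char *a, const char *b)
    {
        /* Look up the real libc strcmp so normal behaviour stays available. */
        int (*real)(const char *, const char *) =
            (int (*)(const char *, const char *))dlsym(RTLD_NEXT, "strcmp");

        fprintf(stderr, "[preload] strcmp(\"%s\", \"%s\")\n", a, b);
        (void)real;    /* call real(a, b) here to keep normal behaviour */
        return 0;      /* pretend every comparison matched */
    }

    A statically linked binary simply doesn't have that seam, which is the trade-off being argued over above.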

    "Open Source doesn't magically make anything good. Many open source people are idiots!"

    GNU/Linux:

    Linux for the most part tries to adhere to Unix principles; this is good. But it also keeps stuff from Unix which should long since have been replaced (terminal madness, anyone?).

    And it follows POSIX and all open source best practices.

    Clipboard handling totally sucks so incredibly much. Not even lobotomized monkeys on crack could fuck up this badly.

    Distributions are incompatible. Most software has a Windows binary, an OS X binary, and then a Debian binary, an Ubuntu binary, a Fedora binary, ...

    Countless duplicated folders, binaries in /bin, /sbin, /usr/bin, /usr/sbin, /usr/local/bin, /opt/bin, ....

    There is no canonical way, ever.

    Backwards compatibility, what's that? Compare running twelve-year-old Windows 98 software on today's Windows to running two-year-old software on today's Linux.

    Linux people love to replace working solutions with giant broken cluster-fuck solutions (see also: Network Manager).

    Dynamic linking, even worse than usual.

    Linux desktop environments usually try to emulate Windows, which sucks. Compare GNOME's registry-like thing to the Windows registry.

    GNU shit like Autotools all over the place.

    Cross-compilers, what are those?

    How does /dev work today?

    OS X:

    So far I haven't been able to stand it long enough to comment on it seriously.

    /Library/Framework/OpenGL/2.0/Framework/lib/OpenGL/2.0/lib/Framework/lib/GL/libGL.so

    The hostname is stored in XML in /System/Libraries/Something/Computer/Buttfucking/Documents/Properties.plist

    [obj-c sucks]

    That GUI. What the fuck. You can't maximize windows reliably? No task bar or anything (lolexpose)?

    Fucking menubar at the top. Trying to avoid locality as much as possible.

    I have to start the file manager to start programs? What the fuck fuckedy fuck fuck fuck?

    Windows:

    Everything is a giant, rigid, monolithic block.

    There is no simple way to combine two things.

    In Unix you solve problems by combining existing programs.

    In Windows you just add another monolithic block.

    Shitty GUIs all over the place. Seriously.

    As a result, it's hard to automate things.

    Naming sucks. Always. C:\Documents and Settings, seriously? (Yes, it has since been fixed.)

    As a hilarious example: SysWOW64 contains x86 DLLs, while System32 contains amd64 DLLs.

    Special cases abound.

    Compare Unix daemons to Windows services.

    Compare Unix X11 programs to Windows magic graphical programs.

    Compare Unix text config files to the Windows registry.

    The system as a whole is insanely complicated and impossible to understand.

    The whole system is impossible to debug. For many problems the only solution is "reinstall".

    Swapping out the kernel. Etc., etc., etc.

  5. Mainway
    Devil

    A case of when

    If you have Windows 10 or OS X or even Android across any part of your enterprise, then you're already fucked! If you have Linux or BSD on Intel or AMD CPUs, then once again you're already fucked. It's not a matter of if, only a matter of when!
