
Nothing new here.
Sure, they're using graphics cards.
But add-on processor boards have been around for years.
Nvidia's GeForce 8 series of graphics chips can be used to crack Windows NT LAN Manager (NTLM) passwords 25 times more quickly than was previously possible, security software developer Elcomsoft has claimed. The Russia-based company this week announced the second major release of its Distributed Password Recovery application …
The GPU is a highly specialized mathematical processor, whereas a typical CPU has to provide generic functions because it serves many different applications. The reason the GPU is so much faster is that its cores are specialized for performing highly complex mathematical operations in nanoseconds. I think many of the people saying that this is nothing new are missing the point: yes, "High-end professional graphic cards with many DSPs working parallel existed even many years ago on IBM and Sun systems AFAIK, and they were really fast although very expensive." The question is whether the tools were available to perform the same functions discussed in this article. Furthermore, the expense is no longer there. Simply buy several cheap GeForce 8 cards and you have a hella system for cracking passwords.
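To make the "embarrassingly parallel" nature of the job concrete, here's a minimal Python sketch (the charset, lengths and the SHA-1 stand-in are my own assumptions; real NTLM is MD4 over the UTF-16LE password, MD4 support in hashlib depends on the local OpenSSL build, and this obviously isn't Elcomsoft's code): every candidate is hashed and checked independently of all the others, which is exactly the shape of work you can fan out across thousands of GPU threads.

```python
import hashlib
from itertools import product

# Toy brute-forcer: each candidate is hashed and compared on its own, with no
# dependency on any other candidate -- the property that lets the real thing
# spread guesses across thousands of GPU threads.
# (Assumption: SHA-1 stands in for NTLM's MD4-over-UTF-16LE hash.)

CHARSET = "abcdefghijklmnopqrstuvwxyz0123456789"
MAX_LEN = 4  # keep the toy search space small


def digest(candidate: str) -> str:
    return hashlib.sha1(candidate.encode()).hexdigest()


def brute_force(target_hash: str) -> str | None:
    for length in range(1, MAX_LEN + 1):
        for chars in product(CHARSET, repeat=length):
            candidate = "".join(chars)
            if digest(candidate) == target_hash:
                return candidate
    return None


if __name__ == "__main__":
    print(brute_force(digest("ab3x")))  # finds "ab3x" after exhausting shorter lengths
```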
I do think, as one poster brought up, that if this could be applied to the various distributed computing projects (Folding@home, SETI@home, etc.) it could dramatically increase the rate at which the work gets processed.
Because your OS needs a wide range of processor functions from a general-purpose processor (the one on your motherboard), whilst GPUs contain processors optimised for a few quite specific maths functions. They are analogous to RISC processors: because they only do a few things well, they can be much smaller, so you can get more of them on a chip for the same cost and power. They are very good at the instructions they are optimised for, but have to emulate the other instructions, which is slow and difficult. If your motherboard only had GPUs it would do 3D graphics very fast, but your filesystem and other general-purpose jobs would run so slowly that the overall effect would be to slow things down. This is why GPUs on dedicated graphics accelerators are a good offload from the general-purpose processor: to each their own tasks. It is also why things like TCP offload engines for network cards or dedicated game physics processors work well, as they are optimised to deliver one type of job which a general-purpose processor is not very efficient at doing.
The argument for GPGPUs is that a graphics processor is basically a really fast floating-point maths machine, exactly what you need for applications like climate modelling or finite element materials stress analysis. This allows you to build a very fast grid computer using cheap, commodity general-purpose processors on the motherboards to manage tasks, networking, memory and storage, but with lots of GPUs to do the hard maths stupidly fast.
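To put a number on the "fast floating-point maths machine" idea, here is a small, hedged sketch (plain NumPy on a CPU rather than an actual GPGPU toolkit, and the workload is invented): the same arithmetic expressed element by element versus as one data-parallel operation, which is the shape of work a GPU's many small arithmetic units are built for.

```python
import time
import numpy as np

# Same floating-point maths two ways: a one-at-a-time loop (how a general-purpose
# core naturally works through it) versus a single vectorised expression over the
# whole array (the data-parallel form that maps onto many simple arithmetic units).

x = np.random.rand(2_000_000)

t0 = time.perf_counter()
loop_result = [v * v + 3.0 * v for v in x]   # serial, element by element
t1 = time.perf_counter()
vec_result = x * x + 3.0 * x                 # one data-parallel expression
t2 = time.perf_counter()

print(f"loop:       {t1 - t0:.3f}s")
print(f"vectorised: {t2 - t1:.3f}s")
```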
The problem in the past with add-in FPGA/DSP/Transputer boards was that they were so complicated to program, with such poor tools, that it took a couple of years to get an application running on them.
Then the board would either go out of production, not be compatible with the next OS release, or Moore's law would catch up with you.
If these use regular graphics cards and normal dev tools, and keep compatibility on new cards, it might be useful.
Given the same die size, you can either have a few fast cores or a lot of slow cores.
For tasks with a lot of good parallelism, it's much better to have a lot of slow cores. This means graphics, password cracking, etc.
For tasks with very little parallelism, e.g., your browser, e-mail program, etc., it's much better to have one (or two, or four) cores that are engineered to run one thread as fast as possible.
I don't know if the "cores" on a GPU today are general-purpose enough to run an OS, but I'm sure it wouldn't be hard to get them up to that point, given enough motivation. But then you'd be stuck running most of your apps as if you had a 200MHz Pentium.
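A quick way to see why the "lots of slow cores" bet only pays off for very parallel jobs is Amdahl's law: speedup = 1 / ((1 - p) + p / n), where p is the fraction of the work that can run in parallel and n is the core count. The sketch below (core counts and parallel fractions are made up for illustration, and it ignores per-core clock speed entirely) shows the speedup collapsing as soon as a workload has a serious serial part.

```python
# Amdahl's law: overall speedup from n cores when only a fraction p of the work
# is parallelisable. The p and n values below are illustrative, not measurements.

def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)


for p in (0.99, 0.50):            # cracking-like workload vs. browser-like workload
    for cores in (4, 128):        # "few fast" vs. "many slow" (clock speed ignored)
        print(f"p={p:.2f}  cores={cores:3d}  speedup={amdahl_speedup(p, cores):6.1f}x")
```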
> ...to crack my password? Assuming they knew my length, and the character sets I use (e.g. upper/lower case, numbers), there are still 201977518437757778375221238472081529012864009105715786231578624 different possibilities.
Jach, if they knew your length, they'd be too busy laughing even to bother...
... and for anyone to make a post like that, I think that gives us a strong clue about your length... :-)
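For anyone wanting to reproduce that kind of keyspace figure, it's just charset_size ** length when the length is known, or a sum over lengths when only an upper bound is known. The charset and lengths in the sketch below are hypothetical, not an attempt to reconstruct the quoted poster's actual numbers.

```python
# Keyspace arithmetic: charset_size ** length for a known length, or the sum over
# all lengths up to a bound. Charset and lengths here are hypothetical examples.

LOWER, UPPER, DIGITS = 26, 26, 10
charset_size = LOWER + UPPER + DIGITS        # 62 symbols

known_length = 12
print(charset_size ** known_length)          # exact length known: 62**12

max_length = 12
print(sum(charset_size ** n for n in range(1, max_length + 1)))  # only an upper bound known
```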
If the stored form of the original password is just a cryptographic hash of some sort, a brute-force approach may well yield a password (or several passwords) different from the original but which works anyway, because it hashes to the same value.
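A toy demonstration of that point (the 20-bit truncation is my own choice, made purely so a collision can actually be found in a few seconds of Python; real password hashes are far longer, which is what makes accidental collisions so rare): the check only ever compares hashes, so any input that happens to hash to the stored value is accepted, original password or not.

```python
import hashlib
from itertools import count

# The verifier never sees the original password, only its hash, so any string
# that hashes to the same value passes the check. The digest is truncated to
# 20 bits here (an assumption for demo purposes) so a collision is findable.

def tiny_hash(s: str) -> str:
    return hashlib.sha256(s.encode()).hexdigest()[:5]   # 20-bit toy hash


def check_password(candidate: str, stored: str) -> bool:
    return tiny_hash(candidate) == stored


stored = tiny_hash("original-password")

for i in count():
    guess = f"guess{i}"
    if check_password(guess, stored):
        print(f"accepted '{guess}' -- different text, same (truncated) hash")
        break
```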