How are GPUs used in brute force attacks?

  • I have read that GPUs can be used in brute-force attacks. But how can this be done, and is there a need for any other hardware devices (hard disks, for instance)?

    Note: I'm more interested in web application security, but I don't want to put on blinders. I'm sorry if my question seems ridiculous to you, but my hardware background isn't very good. I just know how basic components work together and how to combine them.

    Since I can't comment: for instance, I could compute around 33 million MD5 hashes per second using John the Ripper on the CPU, and around 11.8 billion hashes per second using OclHashcat on the GPU. I tested this recently as part of a security class assignment.

    It helps kill time while the CPU does the brute-forcing :P
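    To make the hash-rate comment above concrete, here is a toy single-threaded MD5 brute force in Python. The target password "abc" and the 3-letter lowercase keyspace are invented for the example; real tools like John the Ripper are far faster, and a GPU runs many such loops at once:

```python
import hashlib
import itertools
import string
import time

# Hypothetical target: the MD5 digest of the (unknown) password "abc".
target = hashlib.md5(b"abc").hexdigest()

start = time.time()
tried = 0
found = None
# Enumerate every 3-letter lowercase candidate and hash it.
for candidate in itertools.product(string.ascii_lowercase, repeat=3):
    word = "".join(candidate).encode()
    tried += 1
    if hashlib.md5(word).hexdigest() == target:
        found = word
        break

elapsed = time.time() - start
print(f"cracked {found!r} after {tried} hashes in {elapsed:.4f}s")
```

    Timing this loop gives a (very rough) per-core hash rate; dividing the keyspace size by that rate estimates worst-case cracking time.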

  • Phil Lello

    Correct answer

    5 years ago

    I'm choosing to assume you're asking why it's a risk rather than how to hack.

    GPUs are very good at parallelising mathematical operations, which is the basis of both computer graphics and cryptography. Typically, the GPU is programmed using either CUDA or OpenCL. The reason they're good for brute-force attacks is that they're orders of magnitude faster than a CPU for certain operations - they aren't intrinsically smarter.

    The same operations can be done on a CPU, they just take longer.
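    A sketch of why the work parallelises so well: each candidate password is hashed and checked independently of every other, so the keyspace can be split across workers with no coordination. Here is a minimal CPU-side illustration in Python (the two-letter target "zz" is invented for the example); a GPU does the same thing across thousands of lanes instead of a handful of processes:

```python
import hashlib
import string
from itertools import product
from multiprocessing import Pool

# Hypothetical target: the MD5 digest of the (unknown) password "zz".
TARGET = hashlib.md5(b"zz").hexdigest()

def check_prefix(first: str):
    """Scan every 2-letter candidate starting with `first`.

    Each call is independent of every other call - that independence is
    what lets the search spread across CPU cores or GPU lanes alike.
    """
    for rest in product(string.ascii_lowercase, repeat=1):
        word = (first + "".join(rest)).encode()
        if hashlib.md5(word).hexdigest() == TARGET:
            return word
    return None

if __name__ == "__main__":
    # 26 independent slices of the keyspace, one per first letter.
    with Pool(processes=4) as pool:
        hits = [h for h in pool.map(check_prefix, string.ascii_lowercase) if h]
    print(hits)  # → [b'zz']
```

    Swapping the process pool for GPU threads changes the speed by orders of magnitude, but not the structure of the search.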

    Thank you! I don't want to know how to hack, I just want to understand the idea. So, the software used convinces the GPU that it is doing ordinary graphics work?

    @MahmudMuhammadNaguib It uses the GPU for a non-graphics operation - which tends to be allowed by design these days (it used to be creative hackery). The process is initiated by a normal program.

    Thank you! This is very creative, and these people are real hackers! Do you think companies will modify the design in the future so GPUs can't be used for this purpose, and is this possible (since graphical operations include some mathematical operations, too)?

    @MahmudMuhammadNaguib Absolutely not! While they can be used for brute-forcing, the same capabilities can be used for tons of other useful things. The trend is that GPUs are getting better and better at doing things that aren't, strictly speaking, graphics - and this trend exists because they are being exploited more and more to do that stuff.

    @MahmudMuhammadNaguib I think the trend is to add GPUs onto the die with the CPU to support highly parallel non-graphical operations. I don't see how it would benefit GPU manufacturers to limit non-graphics operations. I'm a little out of my niche here, though.

    I see, I respect specializations.

    Just to underscore @Bakuriu's comment, the CUDA API was created by nVIDIA specifically to enable developers to use their GPUs for non-graphics purposes. They strongly encourage creative uses of their GPUs because they hope to sell more of them as computer performance accelerators. It should not be considered hacking; this is all now mainstream use of GPUs to perform massively parallel computations.

    Since we're talking about Nvidia, look up some pictures of their Tesla cards--"These are built solely for computing, so many of these don’t even have a video output". People who work with them will be used to the idea, but a video card with no output ports on the back of it just looks so cool and different to me.

    @MahmudMuhammadNaguib Why would you deliberately prevent a GPU from being able to process things that aren't graphics? The fact they're called "G"PUs is just historical inertia.

    @MahmudMuhammadNaguib Or to use an analogy: you're watching people throw plastic bottles out of car windows, and you're asking why we don't modify the design of windows so only biodegradable rubbish can pass through them.

    @immibis Who said I would? I am just discussing! And I have mentioned that this is very creative, and that these people are real hackers!

    I'd just like to point out that this is the reason GPUs are considered superior for Bitcoin mining - they excel at performing a basic repetitive task in a massively parallel format, like sending an update to each individual pixel on an HD monitor.

    The real answer for why GPU manufacturers wouldn't even consider limiting this use of their hardware is that it wouldn't make anything safer. GPUs are generally available and easy to install on a general-purpose computer, which is their prime attraction, but custom ASICs can be built that are even faster and, so long as there aren't other considerations at play (like scrypt and memory usage), aren't out of line on cost - which is why most Bitcoin mining these days has moved to custom ASICs.

    Besides cryptography and graphics, doesn't anything in computing require mathematical operations?

    @DDPWNAGE It's not mathematical operations in general that GPUs are good at; it's a particular kind of mathematical operation (ones that parallelise really well). Most computing doesn't involve that kind of operation - most of, say, an email program or a web server is sequential code full of branches and conditions - and these run best on the CPU.

    They are actually usually *slower* at doing the mathematical operation in question, not *faster*, in that it takes more time for them to do one mathematical operation; they just do *more of them at the same time*. GPUs win via parallelism, not via speed. On a batch of jobs they can be faster, and cryptographic attacks often rely on checking a ridiculously large number of combinations of possibilities.
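    A back-of-the-envelope illustration of that throughput-vs-latency point, with invented rates (not measurements of any real hardware):

```python
# Made-up numbers: a CPU core finishes one hash sooner than a single
# GPU lane, but the GPU runs thousands of lanes at once.
cpu_cores, per_core_rate = 8, 50_000_000      # 8 cores x 50M hashes/s each
gpu_lanes, per_lane_rate = 4096, 5_000_000    # 4096 lanes x 5M hashes/s each (10x slower per lane)

cpu_throughput = cpu_cores * per_core_rate    # 400,000,000 hashes/s
gpu_throughput = gpu_lanes * per_lane_rate    # 20,480,000,000 hashes/s
print(gpu_throughput / cpu_throughput)        # → 51.2
```

    Each individual hash takes longer on the GPU, yet the batch finishes ~50x sooner - exactly the situation a brute-force attack is in.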

Licensed under CC BY-SA with attribution

