How to estimate the time needed to crack RSA encryption?

  • How to estimate the time needed to crack RSA encryption? I mean the time needed to crack RSA encryption with key lengths of 1024, 2048, 3072, 4096, 5120, 6144, 7168, 8192, 9216, 10240, 11264, 12288, 13312, 14336, 15360, and 16384?

    "1024-bit RSA encryption cracked by carefully starving CPU of electricity" -- see this article: http://www.engadget.com/2010/03/09/1024-bit-rsa-encryption-cracked-by-carefully-starving-cpu-of-ele/

  • See this site for a summary of the key strength estimates used by various researchers and organizations.

    Your "512-bits in 12μs" is completely bogus. Let's see from where it comes. 1999 was the year when the first 512-bit general factorization was performed, on a challenge published by RSA (the company) and called RSA-155 (because the number consisted in 155 decimal digits -- in binary, the length is 512 bits). That factorization took 6 months. At the Eurocrypt event organized the same year (in May; at that time the 512-bit factorization effort had begun but was not completed yet), Adi Shamir, from the Weizmann Institute, presented a theoretical device called TWINKLE which, supposedly, may help quite a bit in a factorization effort. It should consist in a huge number of diodes flashing at carefully selected frequencies, in a kind of black tube. Shamir brought a custom device which, from 10 meters away, looked like a coffee machine. He asked for people to switch off the light, so that the Eurocrypt attendee could marvel at the four red diodes flashing at invervals of 2, 3, 5 and 7 seconds. Ooh! and Aah! they went, although the actual machine, would it be built, would require a few millions of diodes and frequencies in the 10 or 100 gigahertz. So the idea is fun (at least for researchers in cryptology, who are known to have a strange sense of humor) but has not gone beyond the theoretical sketch step yet. Shamir is a great showman.

    However, TWINKLE is only "help". The best known factorization algorithm is called the General Number Field Sieve (GNFS); the two algorithms which come next are the Quadratic Sieve (QS) and the Elliptic Curve Method (ECM). A 512-bit number is out of reach of QS and ECM with today's technology, and a fortiori with 1999's technology. GNFS is very complex (mathematically speaking), especially since it requires a careful selection of some critical parameters ("polynomial selection"). So there must be an initial effort by very smart brains (with big computers, but brains are the most important here). Afterward, GNFS consists of two parts, the sieve and the linear reduction. The sieve can be run in parallel over hundreds or thousands of machines, which must still be relatively big (in RAM), but this is doable. The linear reduction involves computing things with a matrix which is too big to fit in a computer (by several orders of magnitude, even if we assume that said computer has terabytes of fast RAM). There are algorithms to keep the matrix (which is quite sparse) in a compressed format and still be able to compute on it, but this is hard. In the 512-bit factorization, sieving took about 80% of the total time, but for bigger numbers the linear reduction is the bottleneck.
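
    To get a feel for how the effort grows with key length, one can plug numbers into the standard heuristic GNFS running-time estimate, exp(((64/9)^(1/3) + o(1)) · (ln n)^(1/3) · (ln ln n)^(2/3)). The Python sketch below is my own illustration, not part of any actual factoring effort; it drops the o(1) term, so the absolute numbers mean nothing and only the ratios between sizes are informative:

        import math

        def gnfs_work(bits):
            # Heuristic GNFS complexity L_n[1/3, (64/9)^(1/3)], with the
            # o(1) term dropped -- only ratios between sizes are meaningful.
            ln_n = bits * math.log(2)  # ln(n) for a modulus of the given bit length
            c = (64.0 / 9.0) ** (1.0 / 3.0)
            return math.exp(c * ln_n ** (1.0 / 3.0) * math.log(ln_n) ** (2.0 / 3.0))

        base = gnfs_work(512)
        for bits in (512, 768, 1024, 2048, 4096):
            print("%5d bits: %8.1e times the 512-bit effort" % (bits, gnfs_work(bits) / base))

    The jump from 768 to 1024 bits alone is roughly a factor of one thousand, which is why the 768-bit record described below does not endanger 1024-bit keys.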

    TWINKLE is only about speeding up the sieving part. It does nothing about the linear reduction. In other words, it speeds up the part which is easy (relatively speaking). Even a TWINKLE-enhanced sieving phase would be nowhere near 12μs. Instead, it would rather help bring a four-month sieving effort down to, say, three weeks. Which is good, in a scientific way, but not a record breaker, especially since the linear reduction dominates for larger sizes. The 12μs figure seems to come from a confusion with an even more mythical beast, the Quantum Computer, which could easily factor big numbers if a QC with 512 "qubits" could be built. D-Wave has recently announced a quantum computer with 128 qubits, but it turned out that these were not "real" qubits, and they are unsuitable for factorization (they can still, theoretically, do some efficient approximations in optimization problems, which is great but basically not applicable to cryptography, because cryptographic algorithms are not amenable to approximation -- they are designed so that a single wrong bit scrambles the whole thing). The best "real" QC so far seems to be the prototype by IBM which, as far as I recall, has 5 qubits, enabling it to establish that 15 is equal to 3 times 5.

    The current RSA factorization record is for a 768-bit integer, announced in December 2009. It took four years and involved the smartest number theorists currently living on Earth, including Lenstra and Montgomery, who have somewhat god-like status in those circles. I recently learned that the selection of the parameters for a 1024-bit number factorization has begun (that's the "brainy" part); the sieving is technically feasible (it will be expensive and involve years of computation time on many university clusters) but, for the moment, nobody knows how to do the linear reduction part for a 1024-bit integer. So do not expect a 1024-bit break any time soon.

    Right now, a dedicated amateur using published code (e.g. Msieve) may achieve a 512-bit factorization if he has access to powerful computers (several dozen big PCs, and at least one chock-full of fast RAM) and a few months of free time; basically, "dedicated amateur" means "bored computer science student in a wealthy university". Anything beyond 512 bits is out of reach of an amateur.

    Summary: in your code, you can return "practically infinite" as cracking time for all key lengths. A typical user will not break a 1024-bit RSA key, not now and not in ten years either. There are about a dozen people on Earth who can, with any credibility, claim that it is conceivable, with a low but non-zero probability, that they might be able to factor a single 1024-bit integer at some unspecified time before year 2020.
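
    As a concrete sketch of that summary (my own illustration; the function name and the wording of the returned strings are hypothetical, and the thresholds simply restate the 512-bit amateur and 768-bit record figures above):

        def rsa_cracking_time(key_bits):
            # Order-of-magnitude answers only, restating the figures from
            # the answer above; hypothetical helper, not an established API.
            if key_bits <= 512:
                return "months, for a dedicated amateur with a cluster"
            if key_bits <= 768:
                return "years, for a large academic collaboration (the current record)"
            return "practically infinite"

        for bits in (512, 768, 1024, 2048, 4096, 8192, 16384):
            print(bits, "->", rsa_cracking_time(bits))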

    (However, it is extremely easy to botch an implementation of RSA, or of any application using RSA, in such a way that whatever confidential data it held can be recovered without bothering with the RSA key at all. If you use 1024-bit RSA keys, you can be sure that when your application is hacked, it will not be through an RSA key factorization.)

    +1. I didn't want to claim that the 512-bit factoring thing was bogus since I know far less about quantum (or the whole area in general) than you do, but I had a strong feeling it was. I also had a fair idea factoring algorithms were tailored per number, but not to the extent you describe, as my number theory just isn't that good yet. So +2 if I could.

    +1, @Thomas Pornin: A definitive answer from a real cryptographer.

    I haven't been around this site long, but I'm getting *really* good at guessing which questions are going to be answered by that bear.

    Some years later, 512 bits is down to "about 7.5 hours": http://blog.cryptographyengineering.com/2015/03/attack-of-week-freak-or-factoring-nsa.html (Typical users still aren't anywhere near breaking 1024-bit keys, and won't be within the remaining time of the "ten years" of the answer. The sort of organizations who might possibly be don't make public announcements: http://crypto.stackexchange.com/a/1982/5249)

Licensed under CC BY-SA with attribution

