
Hacker Remix

Transitioning the Use of Cryptographic Algorithms and Key Lengths

91 points by nabla9 1 day ago | 37 comments

tptacek 1 day ago

One rationale seems to be the standardization of PQ cryptography and thus the ability to go directly from weaker cryptography to PQ, rather than in 2 steps (112->128->PQ).

On the chopping block:

* ECB (\o/)

* Triple DES (TDEA)

* Finite field DSA (for new signatures)

* ECDSA at strengths lower than 112 bits

* RSA below 2048 bits

* RNGs, HMACs, HKDF, PBKDF and hashes based on SHA1 and the truncated 224-bit SHA-2/3 modes

No big surprises. The 224's are interesting, because folklorically they have value in hash constructions where resistance to length extension is useful. In practice, everyone just uses HMAC anyways.
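For the curious, a minimal sketch of where that length-extension concern shows up, using Python's stdlib (the key and message here are made up):

    import hashlib, hmac

    key, msg = b"secret-key", b"amount=100"

    # Naive MAC: H(key || msg). A full-width SHA-256 tag exposes the hash's
    # entire internal state, so an attacker can resume hashing from the tag
    # and forge a tag for msg + padding + suffix (length extension).
    naive_tag = hashlib.sha256(key + msg).hexdigest()

    # A truncated output (SHA-224 here) withholds part of that state,
    # which is the folkloric advantage of the 224-bit modes.
    truncated_tag = hashlib.sha224(key + msg).hexdigest()

    # In practice, everyone sidesteps the issue with HMAC instead:
    hmac_tag = hmac.new(key, msg, hashlib.sha256).hexdigest()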


deknos 1 day ago

On the one hand, I am glad that ECB officially dies as a mode; on the other hand, I wonder what NIST officially recommends when you want to encrypt data that's shorter than one block. xD

Regarding finally transitioning away from SHA-1: about fucking time :D

adrian_b 20 hours ago

All other modes are valid for short data.

For instance the CTR mode can be used to encrypt any number of bits, down to a single bit.

The problem with the other modes vs. ECB is that they require the generation and transmission of an "initialization vector", i.e. either a counter value or a random number, depending on the mode, so besides the short encrypted data a whole extra block must be transmitted. This can be avoided only when a set of small messages is treated as part of one long sequence of encrypted data, so that the encryption mode is not reinitialized at each new message but the last state is remembered.
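A minimal sketch of that overhead, assuming the Python cryptography package and AES in CTR mode (the key and nonce here are made up):

    import os
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    key = os.urandom(32)    # AES-256 key
    nonce = os.urandom(16)  # the "initialization vector" for CTR mode

    enc = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
    ct = enc.update(b"hi") + enc.finalize()

    print(len(ct))  # 2: the ciphertext is exactly as short as the plaintext...
    # ...but the receiver also needs the 16-byte nonce, so the message on
    # the wire grows by a whole block unless the nonce is implicit (e.g. a
    # counter both sides already track, as described above).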

ECB is a valid encryption mode only when it is used to encrypt random numbers having the length of the block (or other kinds of data for which there is a strong guarantee that there will be no repeated values). It is secure for challenge-response authentication, if the challenges are unpredictable random numbers. ECB would be a perfectly secure method for encrypting other encryption keys, which must be random, except that one might want to encrypt, together with the values of the keys, other data such as identifiers or error-detection codes, in which case ECB could not be used to encrypt the additional non-random data.
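A sketch of that one legitimate case (random, exactly block-sized plaintext; the names here are made up):

    import os
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    kek = os.urandom(32)       # key-encryption key
    data_key = os.urandom(16)  # uniformly random, exactly one AES block

    enc = Cipher(algorithms.AES(kek), modes.ECB()).encryptor()
    wrapped = enc.update(data_key) + enc.finalize()  # 16 bytes, no IV needed

    # This is only sane because data_key is uniformly random; appending an
    # identifier or checksum would reintroduce structure and break the
    # "no repeated values" precondition.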

tptacek 1 day ago

Any other mode? You can't preserve the original length if you're authenticating anyways.

js2 1 day ago

PQ: post-quantum for anyone else who didn't know.

PeterWhittaker 15 hours ago

I'm surprised to see symmetric algorithms in this list. It's been a while since I worked adjacent to the field (I'm not a cryptographer but spent a lot of time working with them in a past life), but my understanding is that PQ refers to replacing those algorithms that are vulnerable to advances in quantum computing, e.g., public-key algorithms, such as RSA, that rely on the difficulty of factoring large numbers and are therefore subject to attack by efficient implementations of Shor's algorithm.

AIUI, symmetric algorithms such as 3DES are not subject to these attacks, but my understanding could be wrong.

Care to enlighten?

tptacek 14 hours ago

Both ECB and TDEA are dangerously outmoded even if quantum cryptanalysis is never realized; ECB because you can see penguins through it, and TDEA because of the 8-byte block size.
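The "penguins" point, as a minimal sketch (Python cryptography package, made-up key):

    import os
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    key = os.urandom(32)
    pt = b"ATTACK AT DAWN!!" * 2  # two identical 16-byte blocks

    enc = Cipher(algorithms.AES(key), modes.ECB()).encryptor()
    ct = enc.update(pt) + enc.finalize()

    print(ct[:16] == ct[16:])  # True: equal plaintext blocks give equal
                               # ciphertext blocks, which is what makes the
                               # Tux image visible through ECB encryption.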

PeterWhittaker 13 hours ago

Ah, OK, thanks. So part of the broader efforts to modernize, not in response to PQ threats.

sidewndr46 1 day ago

Whew, I was getting nervous. A place I worked at had a developer implement Triple AES. I'd hate for them to have to refactor it.

adrian_b 20 hours ago

"Triple AES" sounds as something insecure if it is similar to triple DES. (Insecure in the sense of providing a small additional strength obtained with a big increase in the time and energy needed for encryption/decryption.)

Only amateurs would choose to implement a "Triple AES", so it is very likely that they would also write a buggy implementation. Triple DES was used not because it was a good strengthening method, but only because it could be used with unmodified hardware modules designed for simple DES. When cipher strengthening is done in software, there are much better methods.

The best way to strengthen AES above the standard AES-256 is to double the block length from 128 bits to 256 bits. Increasing the key length over 256 bits is much less useful, because the key length is not the weakest point of AES-256. A 256-bit key is strong enough even against quantum computers, but short 128-bit blocks can be a vulnerability in certain applications, because birthday collisions between ciphertext blocks become likely after about 2^64 blocks are encrypted under one key. The key schedule algorithm of AES, which converts the cipher key into a set of round keys, is mediocre, so the length of the cipher key is the least important concern about the strength of AES.
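A back-of-the-envelope version of that block-size concern (the usual birthday bound; exact safe limits depend on the mode):

    import math

    # Birthday bound: collisions between ciphertext blocks are expected
    # after roughly 2**(b/2) blocks are encrypted under one key, each
    # block being b/8 bytes long.
    for name, b in [("DES/TDEA", 64), ("AES", 128), ("Rijndael-256", 256)]:
        total_bytes = 2 ** (b // 2) * (b // 8)
        print(f"{name}: ~2^{round(math.log2(total_bytes))} bytes under one key")

    # DES/TDEA:     ~2^35 bytes (32 GiB -- Sweet32 attack territory)
    # AES:          ~2^68 bytes (256 EiB)
    # Rijndael-256: ~2^133 bytes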

The original Rijndael proposal had a stronger variant with 256-bit blocks, which was not retained in the standard. Nevertheless, it is easy to implement with the Intel/AMD AES instructions or with the Arm AArch64 AES instructions. Intel even published an application note describing how to do this when the AES instructions were introduced in the Westmere CPUs.

After increasing the block length, increasing the number of rounds can provide additional strengthening. Another choice would be to replace the standard key schedule algorithm with a stronger non-standard algorithm (i.e. one producing more random round keys). Increasing the key over 256 bits provides much less useful strengthening in comparison with the cost of executing the additional necessary operations.

sidewndr46 14 hours ago

That was pretty much my feedback at the time as well. AES-256 is more than adequate to store some secret data.

Joel_Mckay 1 day ago

libgpg will have Kyber / FIPS 203 working soon.

SPHINCS+ / FIPS 205 should be available soon.

FALCON ...unknown FIPS draft TBA soon.

These are newer quantum resistant algorithms, and should be considered in your future maintenance cycle as they become available in the libraries.

NIST has some of the brightest minds in the world. When they suggest something, one should probably take the advice very seriously. =3

tptacek 1 day ago

My understanding is that NIST has like 2 cryptographers sitting in a closet somewhere. They're good cryptographers, but there aren't many of them.

cperciva 1 day ago

NIST is basically the publishing arm of the NSA, so it really depends on whether the NSA is taking the "protect national information assets" or the "attack foreign information assets" part of its mandate more seriously from year to year.

dfc 1 day ago

NIST does a lot of really neat work outside of crypto standards. Judah Levine and all the other metrology folks are awesome. It's unfortunate that they get grouped together by comments like this.

cperciva 1 day ago

Sorry, yes I only meant in the context of cryptography of course. NIST is a great organization and it's really a historical accident that they do anything with crypto.

Joel_Mckay 1 day ago

One will find the pool of people that deal in esoteric problems tends to be rather small in every field. =3

deknos 1 day ago

What's libgpg? I only know libgpg-error and libgcrypt.

Joel_Mckay 1 day ago

In general, the stable build usually requires:

gnupg 2.4.3

libassuan 2.5.6

libgcrypt 1.10.3

libgpg-error 1.47

libksba 1.6.5

npth 1.6

pinentry 1.2.1

However, the Kyber algorithm was only committed recently, in libgcrypt 1.11.0, and will not build on some platforms due to a libassuan 3.0.1 issue.

Did you have additional details on when a working packaged set of dependencies will be available for static .a builds that support Kyber?

Have a great day =3

upofadown 1 day ago

This refers to the deprecation of 2048 bit RSA after 2030. I wrote an article attacking that policy:

* https://articles.59.ca/doku.php?id=em:20482030

The document specifies that SHA-1 in HMACs is to be entirely disallowed after 2030. That seems like it would cause needless reimplementation of systems, with the associated chance of security problems and expense. SHA-1 used in an HMAC is generally known to be secure.
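For context, HMAC's security rests on the compression function behaving like a PRF, not on collision resistance, which is why HMAC-SHA1 has held up despite SHA-1's broken collision resistance. A minimal sketch with Python's stdlib (key and message are made up):

    import hashlib, hmac

    key, msg = b"shared-secret", b"some message"

    # Still considered secure in practice, but disallowed after 2030:
    tag_sha1 = hmac.new(key, msg, hashlib.sha1).hexdigest()

    # The mandated migration, where the surrounding protocol permits it:
    tag_sha256 = hmac.new(key, msg, hashlib.sha256).hexdigest()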

tptacek 1 day ago

In much the same sense that HMAC-MD5 is "secure". They deprecated all the lower-bit-strength SHA hash constructions.

The 2048 deprecation in 2030 seems to be about quantum resistance, not about a move to 4096 bit RSA.

LegionMammal978 1 day ago

> The 2048 deprecation in 2030 seems to be about quantum resistance, not about a move to 4096 bit RSA.

From [0], where the 112-bit 'security strength' of 2048-bit RSA is ultimately pulled from:

"The comparable security strengths provided below are based on accepted estimates as of the publication of this Recommendation using currently known methods. Advances in factoring algorithms, general discrete-logarithm attacks, elliptic-curve discrete-logarithm attacks, and other algorithmic advances as well as quantum computing may affect these equivalencies in the future. New or improved attacks or technologies may be developed that leave some of the current algorithms completely insecure."

Their recommendation is to switch to 3072-bit RSA or higher by 2031, since that has a 128-bit 'security strength' by their formula (sketched below). So I don't think this has much to do with quantum resistance: as GP says, no reasonable RSA key size will help much with that.

[0] https://nvlpubs.nist.gov/nistpubs/SpecialPublications/NIST.S..., section 5.6.1
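The 'formula' is the usual general number field sieve cost estimate; a rough sketch of how those strength numbers are derived (my own approximation; NIST rounds the results down to conventional levels):

    import math

    def gnfs_strength_bits(modulus_bits):
        # Approximate cost, in bits, of running the general number field
        # sieve against an RSA modulus of the given size.
        c = (64 / 9) ** (1 / 3)  # GNFS constant, ~1.923
        ln_n = modulus_bits * math.log(2)
        return c * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3) / math.log(2)

    for bits in (1024, 2048, 3072, 7680, 15360):
        print(bits, round(gnfs_strength_bits(bits)))
    # Prints roughly 87, 117, 139, 203, 269, which NIST rounds down to the
    # conventional strengths 80, 112, 128, 192, and 256.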

tptacek 1 day ago

I'm citing (paraphrasing) this more recent document, page 4, line 238. Let me know if I've got it wrong.

upofadown 18 hours ago

Line 244:

>Currently, a 112-bit security strength for the classical digital signature and key-establishment algorithms does not appear to be in imminent danger of becoming insecure in the near future, so this approach should allow an orderly transition to quantum-resistant algorithms without unnecessary effort for the cryptographic community.

I get from this that NIST thinks the quantum threat is significantly greater than the threat from advances in classical computing hardware or algorithms. So we are not to bother with transitioning from 112-bit to 128-bit equivalent strength and are to concentrate on the post-quantum stuff instead. As a result, stuff like 2048-bit RSA is now allowed at the "deprecated" level, where it was previously to be "disallowed" after 2030.

It seems that the quantum and classical threats both currently depend on a fundamental breakthrough, so I am not sure how legitimate this policy is. It is reminiscent of the NSA suggestion to not bother transitioning to elliptic-curve methods and to skip directly to post-quantum methods.

tptacek 14 hours ago

Deprecating RSA-2048 for other reasons doesn't make much sense. Whatever is going to break RSA-2048 is likely to break all of RSA. The story we're commenting on is pretty clear that the motivation here is to streamline the logistics of moving to PQ cryptography.

Credible new systems aren't going to be developed with RSA, regardless.

deknos 1 day ago

SHA-1 has been around long enough that precomputation tables can be built. The NSA and other state-backed organizations have the capacity to do that. The community should at least up the ante to 256 bits to make things harder.

veggieWHITES 1 day ago

We shouldn't be listening to NIST for any sort of cryptographic advice. [1]

[1] https://en.wikipedia.org/wiki/National_Institute_of_Standard...

gruez 1 day ago

So we should continue using ECB and RSA < 2048?

y-curious 1 day ago

Not if you want to get FedRAMP designation at any point.

archgoon 1 day ago

Ah, but that's the beauty of it. If you encrypt with ECB, you can't be decrypted by a federally compliant organization!

kurikuri 3 hours ago

Unfortunately, a federally compliant organization could still decrypt it because ECB decryption is still allowed for legacy use.

User23 18 hours ago

This looks like spooks did spook stuff, got caught, and NIST fixed it? Is there evidence NIST colluded or is the NSA just good at its job?