
The gist of this Auth0 authentication API bypass is detailed as follows:

> The Authentication API prevented the use of alg: none with a case sensitive filter. This means that simply capitalising any letter e.g. alg: nonE, allowed tokens to be forged.

I don't understand why you would need a case-sensitive filter for 'alg: none' at all. The question is why the standard uses and supports 'alg: none' in the first place. As I previously commented, the option to have 'alg: none' should never be used, as it is still the biggest footgun in the JOSE specification. Even giving the user a choice of ciphers is a recipe for disaster. Thus, JWT remains a cryptographically weak standard, and its use is discouraged by many cryptographers.
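
The bypass is easy to reproduce. Here's a minimal, hypothetical reconstruction (the function names and structure are mine, not Auth0's actual code) of how a case-sensitive blacklist on the alg header fails against a token claiming alg: nonE:

```python
import base64
import json

def b64url_decode(part: str) -> bytes:
    # JWT uses unpadded base64url; re-add padding before decoding.
    return base64.urlsafe_b64decode(part + "=" * (-len(part) % 4))

def b64url_encode(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def is_rejected_by_case_sensitive_filter(token: str) -> bool:
    # Hypothetical reconstruction of the flawed check: it only blocks
    # the exact lowercase string "none".
    header = json.loads(b64url_decode(token.split(".")[0]))
    return header.get("alg") == "none"

# A forged, unsigned token whose header says alg: "nonE".
header = b64url_encode(json.dumps({"alg": "nonE", "typ": "JWT"}).encode())
payload = b64url_encode(json.dumps({"sub": "admin"}).encode())
forged = f"{header}.{payload}."  # empty signature segment

print(is_rejected_by_case_sensitive_filter(forged))  # False: filter bypassed
```

If the downstream verifier then treats any casing of "none" as the none algorithm, the forged token sails through.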

PASETO [0] or Branca [1] are cryptographically stronger alternatives to JWT here.

[0] https://paseto.io

[1] https://branca.io



> the option to have 'alg: none' should never be used

I doubt anyone uses this deliberately (edit: except maybe for internal server-to-server communications?). I agree that having it as an option is a footgun. I still think this is a non-issue on the client/backend; most libraries explicitly make you whitelist token signing algorithms and will throw errors if the token isn't signed with the right algorithm.
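
That whitelist behaviour can be implemented directly. A stdlib-only sketch (the helper names are mine; real code should use a maintained JWT library) that rejects any algorithm outside an explicit allow-list before the signature is ever considered:

```python
import base64
import hashlib
import hmac
import json

ALLOWED_ALGS = {"HS256"}  # explicit whitelist; everything else is rejected

def b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def b64url_decode(part: str) -> bytes:
    return base64.urlsafe_b64decode(part + "=" * (-len(part) % 4))

def verify(token: str, key: bytes) -> dict:
    header_b64, payload_b64, sig_b64 = token.split(".")
    header = json.loads(b64url_decode(header_b64))
    # Check the whitelist first: "none", "nonE", or any surprise
    # algorithm fails here, before the signature is even looked at.
    if header.get("alg") not in ALLOWED_ALGS:
        raise ValueError(f"algorithm {header.get('alg')!r} not allowed")
    expected = hmac.new(key, f"{header_b64}.{payload_b64}".encode(),
                        hashlib.sha256).digest()
    if not hmac.compare_digest(b64url(expected), sig_b64):
        raise ValueError("bad signature")
    return json.loads(b64url_decode(payload_b64))
```

Because the allow-list check happens unconditionally, case tricks on the alg header buy an attacker nothing.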

> Even giving the user a choice of ciphers to use is a recipe for disaster.

How so? I'm still learning this stuff, so I'm genuinely curious.


> How so? I'm still learning this stuff, so I'm genuinely curious.

It is the same reason why the author of Wireguard rejected cryptographic agility in its use of protocols and ciphers:

From the Wireguard paper [0]:

> 'Finally, WireGuard is cryptographically opinionated. It intentionally lacks cipher and protocol agility. If holes are found in the underlying primitives, all endpoints will be required to update. As shown by the continuing torrent of SSL/TLS vulnerabilities, cipher agility increases complexity monumentally.'

[0] https://www.wireguard.com/papers/wireguard.pdf


> > Even giving the user a choice of ciphers to use is a recipe for disaster.

> How so? I'm still learning this stuff, so I'm genuinely curious.

https://paragonie.com/blog/2019/10/against-agility-in-crypto... :)


I really hope you're planning some agility in PASETO; otherwise it's a de facto s* protocol that will have to be thrown away within a few years upon the first cryptographic weakness, breaking all applications that dared to adopt it.

Fact is, ciphers and protocols evolve over time. In the real world of clients and servers (often many clients and many servers), it's not possible to magically upgrade all systems at once to exclusively accept a single cipher. There's got to be a way to phase ciphers in gradually across systems and phase them out again. Agility is simply a constraint imposed by operating software in the real world.


> I really hope you're planning some agility in PASETO; otherwise it's a de facto s* protocol that will have to be thrown away within a few years upon the first cryptographic weakness, breaking all applications that dared to adopt it.

Instead of cipher agility, PASETO uses versioned protocols.
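
Concretely, a PASETO token begins with a version-and-purpose header such as v2.local. or v2.public., and each version pins one fixed set of primitives, so a verifier only checks that prefix against the versions it supports. A rough sketch (v2.local's XChaCha20-Poly1305 and v2.public's Ed25519 are from the PASETO spec; the rest is illustrative, not a real implementation):

```python
# Each supported header maps to one fixed, non-negotiable primitive.
SUPPORTED = {
    "v2.local": "XChaCha20-Poly1305",  # symmetric encryption, per the spec
    "v2.public": "Ed25519",            # public-key signatures, per the spec
}

def check_paseto_header(token: str) -> str:
    version, purpose = token.split(".")[:2]
    header = f"{version}.{purpose}"
    if header not in SUPPORTED:
        # No negotiation, no downgrade: unknown versions are rejected outright.
        raise ValueError(f"unsupported PASETO header {header!r}")
    return SUPPORTED[header]
```

The difference from agility is that a version bump replaces the whole suite at once; there is no per-token menu for an attacker to pick from.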

My DEFCON Crypto & Privacy Village talk (slides and YouTube video at https://paseto.io for the curious) covered this distinction in detail.


It’s versioned, which is an improvement on “agility”.


Protocol agility allows applications to pick between multiple settings, say HS256 and RS256. This makes it possible to add and remove ciphers over time, which is very important.

In theory it's a bad idea, because it means systems might select obsolete ciphers during operation.

In practice, there is no choice but to design for agility. Ciphers invariably weaken after some years (computers get faster), so they need to be phased out and replaced by newer ones.

In the real world, there are meshes of clients and servers interacting with one another. You can't just upgrade the software on one side to only use the newer cipher, or nothing could connect to it anymore. Thus there has to be the capability to work with multiple ciphers, so newer ciphers can gradually be phased in across systems and older ones phased out.
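
The phased rollout described above can be made explicit. A hypothetical migration schedule (the algorithm names and phase numbers are illustrative) where each side's accept-list widens, the signing algorithm switches, then the list narrows again:

```python
# Hypothetical schedule for migrating a fleet from HS256 to ES256
# without ever breaking peers that are one phase behind.
PHASES = {
    1: {"sign_with": "HS256", "accept": {"HS256"}},           # steady state
    2: {"sign_with": "HS256", "accept": {"HS256", "ES256"}},  # new alg phased in
    3: {"sign_with": "ES256", "accept": {"HS256", "ES256"}},  # switch signing
    4: {"sign_with": "ES256", "accept": {"ES256"}},           # old alg phased out
}

def can_interoperate(sender_phase: int, receiver_phase: int) -> bool:
    # The receiver must accept whatever algorithm the sender signs with.
    return PHASES[sender_phase]["sign_with"] in PHASES[receiver_phase]["accept"]
```

Any two adjacent phases interoperate, which is exactly the property a gradual rollout needs; only a system stuck at phase 1 talking to one at phase 4 fails.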

Pretty sure the two other commenters are mostly researchers with no real-world software deployments to manage; otherwise they wouldn't be so strongly against agility. The fact is, a system with no agility is dead in the water because it can't evolve.


This is exactly backwards.

In theory cipher agility is useful so you can upgrade your suite over time; in practice it's a terrible idea, because it will lead, and has led time and again, to downgrade attacks.

Crypto researchers have learned this fact of life the hard way; crypto systems are becoming less “agile” over time, because agility means a system is preemptively broken.


> The question is why the standard uses and supports 'alg: none' in the first place.

FTFA:

> The JWT standard supports insecure JWT algorithms for scenarios where encryption and a signature are not suitable, such as trusted server-to-server communication. In these scenarios, the none algorithm is specified in the JWT header. The none alg type should never be used for untrusted user-supplied tokens.


It's funny that they say it's "not suitable" when it's really just pure laziness. It takes two seconds to create a secret key and use the HS256 algorithm to generate and verify a signature.
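
For scale, here is roughly what that "two seconds" amounts to using nothing but the standard library (a sketch; a real service should still reach for a vetted JWT library):

```python
import base64
import hashlib
import hmac
import json
import secrets

def b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

# Step 1: create a secret key.
key = secrets.token_bytes(32)

# Step 2: sign with HS256, i.e. HMAC-SHA256 over "header.payload".
def sign_hs256(claims: dict, key: bytes) -> str:
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = b64url(json.dumps(claims).encode())
    sig = hmac.new(key, f"{header}.{payload}".encode(), hashlib.sha256).digest()
    return f"{header}.{payload}.{b64url(sig)}"

# Step 3: verification is recomputing the MAC and comparing in constant time.
def verify_hs256(token: str, key: bytes) -> bool:
    header, payload, sig = token.split(".")
    expected = hmac.new(key, f"{header}.{payload}".encode(),
                        hashlib.sha256).digest()
    return hmac.compare_digest(b64url(expected), sig)
```

Whatever effort "server-to-server trust" saves, it isn't much more than these few lines.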


Creating a key is easy; storing it securely and giving access to only the parts of your system that need it takes a bit more work.

Of course, if you trust your network and the parties involved well enough that you'd be fine with unencrypted and unauthenticated data, I guess it doesn't matter if you just check the key into a git repo somewhere... but then you're potentially normalizing bad practices, even if in that particular instance it might be ok.


"trusted server"



