What is the most secure hashing algorithm in the .NET framework?

Asked by 忘了有多久 on 2021-01-03 13:21

The size of the generated hash and the speed of the algorithm are not important. I'm really only interested in it being the most secure option. I don't want to use any third-party implementations.

3 Answers
  •  野趣味 2021-01-03 13:42

    You state that the speed of the algorithm is not important, but actually it's essential.

    A lot depends on the definition of 'secure'. SHA512 is (just about) impossible to reverse, but it's fairly easy to attack by brute force.

    This is because it is fast - you could think of it as a fundamental design flaw of the SHA 'family' that its members are designed to be very quick.

    This is a problem - SHA512 achieves its design goal of being very fast (it's not much slower than SHA1), but if you're a hacker trying to brute-force passwords, that speed makes them easier to crack. Ten or even five years ago a serious brute-force attack would have been out of the question; now it takes a couple of fancy graphics cards or some cloud time.

    This is where key-stretching algorithms come in - they make the process of building a password hash deliberately slow. Slow enough that a user verifying a single hash won't notice, but a brute-force attack will take too long.

    A good example of a key-stretching algorithm is PBKDF2 (specified in RFC 2898) - it combines the password with a long salt and iterates a SHA-based HMAC thousands of times to create a hash that's slow to reproduce.

    .Net has a native implementation of this: Rfc2898DeriveBytes
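
    The following is a minimal sketch of how Rfc2898DeriveBytes might be used to hash and verify a password. The "salt:hash" storage format, the names CreateHash/Verify, and the size constants are illustrative assumptions, not anything the framework prescribes:

        using System;
        using System.Linq;
        using System.Security.Cryptography;

        class Pbkdf2Sketch
        {
            const int Iterations = 1000; // the .Net default mentioned below
            const int SaltSize = 16;     // bytes
            const int HashSize = 32;     // bytes

            // Derive a salted hash and return it as "salt:hash" in Base64 (illustrative format).
            static string CreateHash(string password)
            {
                using (var kdf = new Rfc2898DeriveBytes(password, SaltSize, Iterations))
                {
                    byte[] hash = kdf.GetBytes(HashSize);
                    return Convert.ToBase64String(kdf.Salt) + ":" + Convert.ToBase64String(hash);
                }
            }

            // Re-derive with the stored salt and compare against the stored hash.
            static bool Verify(string password, string stored)
            {
                string[] parts = stored.Split(':');
                byte[] salt = Convert.FromBase64String(parts[0]);
                byte[] expected = Convert.FromBase64String(parts[1]);
                using (var kdf = new Rfc2898DeriveBytes(password, salt, Iterations))
                {
                    byte[] actual = kdf.GetBytes(expected.Length);
                    // A constant-time comparison is preferable in production code.
                    return actual.SequenceEqual(expected);
                }
            }

            static void Main()
            {
                string stored = CreateHash("correct horse battery staple");
                Console.WriteLine(Verify("correct horse battery staple", stored)); // True
                Console.WriteLine(Verify("wrong password", stored));               // False
            }
        }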

    Microsoft use it for System.Web.Helpers.Crypto.HashPassword, but you can easily review that source to use the same approach elsewhere.

    On my machine now (a fairly rubbish old laptop) a single .Net Rfc2898DeriveBytes hash with 1000 iterations (the default) takes around 50ms, while a brute-force attack can try around 250,000 SHA512 hashes per second.
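
    If you want a rough feel for that gap on your own hardware, a quick (and deliberately unscientific) micro-benchmark might look like this; the figures will vary a lot from machine to machine:

        using System;
        using System.Diagnostics;
        using System.Security.Cryptography;
        using System.Text;

        class SpeedComparison
        {
            static void Main()
            {
                byte[] password = Encoding.UTF8.GetBytes("hunter2");
                byte[] salt = new byte[16]; // all-zero salt, fine for timing purposes only

                // One PBKDF2 derivation at the default 1000 iterations.
                var sw = Stopwatch.StartNew();
                using (var kdf = new Rfc2898DeriveBytes("hunter2", salt, 1000))
                    kdf.GetBytes(32);
                sw.Stop();
                Console.WriteLine("Rfc2898DeriveBytes (1000 iterations): " + sw.ElapsedMilliseconds + " ms");

                // 250,000 raw SHA512 hashes for comparison.
                sw.Restart();
                using (var sha = SHA512.Create())
                {
                    for (int i = 0; i < 250000; i++)
                        sha.ComputeHash(password);
                }
                sw.Stop();
                Console.WriteLine("250,000 SHA512 hashes: " + sw.ElapsedMilliseconds + " ms");
            }
        }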

    So in .Net right now the most secure option is to use Rfc2898DeriveBytes.

    However, RFC2898/PBKDF2 does have a weakness - while it is slow, it needs very little memory per hash, and parallel computing keeps getting cheaper. Right now it's pretty much un-brute-forceable, but in 5 or 10 years?

    So the next generation are algorithms like bcrypt and scrypt, which are designed to use a lot of memory for each hash, making parallel execution expensive. While there are .Net implementations, there isn't a native one (yet), and I'd be wary of using one until there is - adopting them will affect things like concurrent log-ons (if used for passwords) and so introduces a lot of risk for early adopters.
