RustCrypto/traits

digest: add `DynDigest::max_input_size`

Bl4omArchie opened this issue · 7 comments

Recently I've opened a ticket in RustCrypto's RSA repo: RustCrypto/RSA#434

My request is to add a new method to digest::DynDigest, so we can retrieve the maximum input size for each hash function.

    fn max_input_size(&self) -> Option<u64>

For now I'm looking at the code so I can understand how to detect which hash is used.

@Bl4omArchie the idea would be that each digest could specify this, so you don't have to "detect" which digest is being used.

A tricky part here, though, is that DynDigest is mainly used via a blanket impl. So while this method could easily be added, even in a non-breaking manner by having it return None by default, the harder part is making it possible to actually return a value via the blanket impl.

That might require the specification of additional traits, e.g. something like MaxInputSizeUser, and changing the bounds of the blanket impl to include such a trait.
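A minimal sketch of what that could look like, using stand-in traits rather than the real digest crate API (MaxInputSizeUser is the hypothetical trait name from the comment above, and Sha256Like is an illustrative placeholder type; the real blanket impl has additional bounds):

```rust
/// Hypothetical helper trait from the discussion above.
trait MaxInputSizeUser {
    /// Maximum input size in bytes, if the algorithm defines one.
    const MAX_INPUT_SIZE: Option<u64>;
}

/// Stand-in for the object-safe `DynDigest` trait.
trait DynDigest {
    /// Returning `None` by default keeps the addition non-breaking
    /// for implementations that don't state a limit.
    fn max_input_size(&self) -> Option<u64> {
        None
    }
}

/// Illustrative hash type.
struct Sha256Like;

impl MaxInputSizeUser for Sha256Like {
    // SHA-256: 2^64-bit input limit = 2^61 bytes.
    const MAX_INPUT_SIZE: Option<u64> = Some(1 << 61);
}

// Blanket impl: the `MaxInputSizeUser` bound lets the object-safe
// method forward the type-level constant through `dyn DynDigest`.
impl<T: MaxInputSizeUser> DynDigest for T {
    fn max_input_size(&self) -> Option<u64> {
        T::MAX_INPUT_SIZE
    }
}

fn main() {
    let d: &dyn DynDigest = &Sha256Like;
    println!("{:?}", d.max_input_size()); // Some(2305843009213693952)
}
```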

Most hash functions have limits which are simply not reachable in practice. For example, consider the 2^64-bit limit for SHA-256. With hardware-accelerated SHA-NI you get a throughput of roughly 2 cycles per byte, so to overflow the 64-bit bit counter on a 4.3 GHz CPU (~2^32 Hz) you would need 2 * 2^61 / 2^32 = 2^30 seconds, or about 34 years of continuous computation.
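The back-of-the-envelope figures above can be checked directly (assumed inputs: the 2^64-bit SHA-256 limit, ~2 cycles/byte for SHA-NI, and a ~4.3 GHz ≈ 2^32 Hz clock):

```rust
/// Seconds of continuous hashing needed to overflow the SHA-256
/// 64-bit bit counter at ~2 cycles/byte on a ~2^32 Hz CPU.
fn seconds_to_overflow() -> u128 {
    let max_bytes: u128 = 1 << 61; // 2^64 bits = 2^61 bytes
    let cycles = 2 * max_bytes;    // ~2 cycles per byte => 2^62 cycles
    cycles / (1 << 32)             // cycles / Hz = 2^30 seconds
}

fn main() {
    let secs = seconds_to_overflow();
    let years = secs / (365 * 24 * 60 * 60);
    println!("{secs} seconds ≈ {years} years"); // 1073741824 seconds ≈ 34 years
}
```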

So I consider this feature to have the same importance for us as support for bit-level updates.

This issue was opened to clear a #todo in the rsa crate. So if you judge that this check doesn't matter, I understand, because yes, 34 years of computation is far removed from real-life scenarios.
But I still wonder: could it not have an impact in the near future? Say, in anticipation of future improvements?

> could it not have an impact in the near future

I highly doubt that you can be an order of magnitude faster than SHA-NI with sequential SHA-256 computation on conventional CPUs.

@tarcieri
WDYT? It looks like adding an associated constant to the Update trait with a None default would be easy enough (and subsequently Digest::max_input_size), but, as I wrote above, I am not sure about the practical importance of exposing this information. Currently, we have no overflow checks for block/byte/bit counters, and I don't think it's worth changing that by introducing panics or additional try_* methods.
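The associated-constant idea could be sketched like this, again with stand-in traits rather than the real digest crate API (the constant name and Sha256Like type are illustrative):

```rust
/// Stand-in for the `Update` trait.
trait Update {
    /// Maximum input size in bytes; the `None` default keeps the
    /// addition non-breaking for existing implementations.
    const MAX_INPUT_SIZE: Option<u64> = None;

    fn update(&mut self, data: &[u8]);
}

/// Illustrative hash type that tracks how many bytes it has seen.
struct Sha256Like(u64);

impl Update for Sha256Like {
    // SHA-256: 2^64-bit input limit = 2^61 bytes.
    const MAX_INPUT_SIZE: Option<u64> = Some(1 << 61);

    fn update(&mut self, data: &[u8]) {
        self.0 = self.0.wrapping_add(data.len() as u64);
    }
}

/// A `Digest::max_input_size`-style accessor would just forward the constant.
fn max_input_size<D: Update>() -> Option<u64> {
    D::MAX_INPUT_SIZE
}

fn main() {
    println!("{:?}", max_input_size::<Sha256Like>()); // Some(2305843009213693952)
}
```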

I think the concerns in RustCrypto/RSA#434 are largely historical, since hash functions newer than SHA-1 have significantly higher input limits, to the point that they're not really practical to encounter.

I'm fine with the current situation in that issue, which is merely a hardcoded maximum, especially as the limit in that case is for OAEP labels, which are typically rather short strings.

SHA-1 and SHA-256 have the same limit (2^64 bits), so I think that it's similarly not possible to hit the limit in practice. I agree that it's fine to limit input size by 2^64 bits for all hash functions. Even if in theory SHA-512 and SHA-3 can process longer inputs, it should be impossible to encounter such sizes in practice on one machine.