juanelas/bigint-crypto-utils

`randBetween` performance

Rudxain opened this issue · 2 comments

I (and most other people) appreciate that `randBetween` isn't biased. However, based on this answer on SO, there's a more performant way to get rid of the bias. It's faster, more energy-efficient, and reduces the possibility of running out of entropy (because systems like Linux have an "entropy pool" that can get exhausted).

However, this seems to affect small intervals more than large ones, so there is little cause for concern here since the library deals with BigInts. I still recommend it, though, because it would improve average performance.
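For illustration, here is a rough sketch of the kind of technique I mean (not the SO answer verbatim; the function name is made up and it assumes Node's `crypto.randomBytes`): reject only the values above the largest multiple of the range, then reduce modulo the range.

```ts
import { randomBytes } from 'crypto'

// Hypothetical sketch, not the library's API: unbiased sampling in [min, max]
// via modular reduction with a near-zero rejection probability.
function randBetweenModSketch (max: bigint, min = 0n): bigint {
  const range = max - min + 1n
  // Draw 64 more bits than the range needs, so a draw is rejected with probability < 2^-64
  const k = range.toString(2).length + 64
  const byteLen = Math.ceil(k / 8)
  const excessBits = BigInt(8 * byteLen - k)
  // Largest multiple of `range` that is <= 2^k; rejecting everything above it removes the modulo bias
  const limit = (1n << BigInt(k)) - ((1n << BigInt(k)) % range)
  let r: bigint
  do {
    r = BigInt('0x' + randomBytes(byteLen).toString('hex')) >> excessBits // k random bits
  } while (r >= limit) // almost never loops
  return min + (r % range)
}
```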

Thank you for the proposal!
But I'm not generating the result as a modulo. My approach is to generate exactly as many random bits as the maximum value has and then check that the result is less than or equal to that maximum. Only when the most significant bit of the generated value is 1 can it happen that I have to draw another random value, so the rejection probability is between 0 and 0.5 at most; even in the worst case, about two draws suffice on average. My initial thought is that implementing the other proposal (computing modular reductions) with big numbers is likely to have a higher cost. Someday, if I find the time, I will benchmark it, but for now my intuition tells me the current approach is the cheaper one.
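For illustration, a minimal sketch of that idea (this is not the library's actual code; the name is illustrative and it assumes Node's `crypto.randomBytes`):

```ts
import { randomBytes } from 'crypto'

// Rejection-sampling sketch: draw exactly bitLength(interval) random bits and
// retry while the value exceeds the interval.
function randBetweenSketch (max: bigint, min = 0n): bigint {
  const interval = max - min // sample uniformly in [0, interval]
  const bitLen = interval.toString(2).length
  const byteLen = Math.ceil(bitLen / 8)
  const excessBits = BigInt(8 * byteLen - bitLen)
  let rnd: bigint
  do {
    // Exactly bitLen random bits: drop the surplus high-order bits of the random bytes
    rnd = BigInt('0x' + randomBytes(byteLen).toString('hex')) >> excessBits
  } while (rnd > interval) // rejection probability is at most 0.5, so at most ~2 draws on average
  return min + rnd
}
```

Since the interval's top bit is set, any draw whose top bit is 0 is always accepted, which is where the 0.5 bound comes from.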
Regarding the entropy, I wouldn't be worried: in recent Linux kernels both /dev/random and /dev/urandom use the same CSPRNG, so they can generate random numbers for a long time without running out of entropy (entropy is only needed for the seed). I know this breaks some crypto myths, but it is how things work now. In fact, they are likely to be merged into a single cryptographically secure random source in the future, the same way BSD has already done.

Thank you for the info. I forgot that BigInt division has an overhead that could reduce performance, so (as you said) this would require benchmarks to determine whether it's a good or bad idea.