Memory and storage requirements?
andres-erbsen opened this issue · 1 comment
Do you have a good sense of how much RAM and disk CryptOpt should use? I haphazardly attempted a week-long CryptOpt run and saw it reach memory exhaustion on day 2. Here's the output from one of the rare instances that was killed by Node itself rather than by the Linux OOM killer:
fiat_curve25519_solinas_square| run| 6|bs 156|#inst: 103|cycl_ 12|G 26 cycl _ 0|B 26 cycl _ 0|L 64|l/g 2.5098| P|P[ -1/ 0/ 1/ -1]|D[FL/ 48/ 13/ 57]|47.2M(43%) 115/s
<--- Last few GCs --->
[212:0x67a1c50] 513566914 ms: Scavenge 3885.3 (4129.6) -> 3878.9 (4129.6) MB, 3.2 / 0.0 ms (average mu = 0.629, current mu = 0.642) task;
[212:0x67a1c50] 513566933 ms: Scavenge 3886.7 (4129.6) -> 3879.1 (4129.6) MB, 3.4 / 0.0 ms (average mu = 0.629, current mu = 0.642) allocation failure;
[212:0x67a1c50] 513566952 ms: Scavenge 3886.0 (4129.6) -> 3879.2 (4145.6) MB, 3.7 / 0.0 ms (average mu = 0.629, current mu = 0.642) task;
<--- JS stacktrace --->
Does 4GB/process look like expected memory usage to you? I could provision that much, but I can't think of a reason why it would be needed.
I believe this is the invocation I used:
# one CryptOpt worker per core, pinned with taskset, each in its own tmux window;
# --evals $((175*60*60*24*9)) works out to about 136M evaluations per worker
for i in $(seq 0 $(( $(nproc) - 1 ))); do
  tmux new-window -n "c$i"
  tmux send-keys "taskset -c $i ~/CryptOpt/CryptOpt --no-proof --resultDir /mnt/results --curve curve25519_solinas --method square --framePointer save --evals $((175*60*60*24*9))" C-m
  sleep "0.$RANDOM"  # stagger the startups slightly
done
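(The GC log tops out around 4.1 GB, which I assume is just Node's default old-space cap. If 4 GB per process really is the expected footprint, I'd guess I could raise that cap when launching each worker, e.g. by replacing the send-keys line above with something like the following, assuming the ~/CryptOpt/CryptOpt wrapper launches a plain node process that honours NODE_OPTIONS:)

# sketch: same worker command, with Node's V8 old-space limit raised to 8 GB
tmux send-keys "NODE_OPTIONS=--max-old-space-size=8192 taskset -c $i ~/CryptOpt/CryptOpt --no-proof --resultDir /mnt/results --curve curve25519_solinas --method square --framePointer save --evals $((175*60*60*24*9))" C-m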
Additionally, the results directory seems to have accumulated 49GB of CSV files (plus some asm and json files). Are these something I may want to look at, or should I perhaps not be collecting them at all?
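If it helps, something like this (plain GNU find/du, nothing CryptOpt-specific) should show how the 49GB breaks down by extension:

# sketch: per-extension disk usage under /mnt/results
for ext in csv asm json; do
  printf '%-5s' "$ext"
  find /mnt/results -type f -name "*.$ext" -print0 | du -ch --files0-from=- | tail -n 1
done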
The memory usage comes solely from keeping track of the ratios along the optimisation. Those are then written to the CSV files, too (hence they are so big), and are used afterwards to generate the plot of how the optimisation ran.

It seems you've given each process roughly 100M evaluations. (I've never done so many; the max was around 10M, I believe.) However, optimisation depends heavily on the curve and the machine. Maybe check the generated PDFs to see how the optimisation went, and then judge whether there is much more left to improve after, say, 1M/5M evals. Curve25519 is not too big of a curve, and there is not too much to explore (as compared to 448 or 521, for instance).
I could also see not keeping track of the ratios, which should keep the memory/storage down.
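For illustration, the same loop with the evaluation budget cut to 5M per worker (flags copied from your command above) would look something like this; the point is just to see how far the optimisation gets before committing to a multi-day run:

# sketch: your loop, but with the eval budget reduced to 5M per worker
for i in $(seq 0 $(( $(nproc) - 1 ))); do
  tmux new-window -n "c$i"
  tmux send-keys "taskset -c $i ~/CryptOpt/CryptOpt --no-proof --resultDir /mnt/results --curve curve25519_solinas --method square --framePointer save --evals $((5*1000*1000))" C-m
  sleep "0.$RANDOM"
done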