Any way to reduce the memory use?
Closed this issue · 5 comments
For some large binaries (around 400MB), the process gets OOM-killed by the OS. My host has 180GB of memory....
I think there are many tools focused on this, or you can just use the built-in pprof.
In most cases, runtime memory usage is not related to the binary size.
Hm, I read OP's message differently: running gsa on a 400MB binary results in the gsa process getting killed due to OOM.
@joway can you confirm that?
Oh, I understand what you mean now.
Are there any samples you can provide? I'm analysing a k8s binary of about 200MB and it works fine.
One reason I can imagine is that the decompile step creates too many goroutines in a split second, consuming all the available memory, but I don't think that is likely to exhaust a machine with 180GB of RAM.
I'll try to put a limit on it; please let me know if that fixes your problem.
A configurable GC parameter was added in 1.0.5.