ansi2html runs out of memory with a really big file
Closed this issue · 3 comments
Hello,
I need to process a really big file (8GB) with ansi2html like:
cat -u output | ansi2html -l -p > ../out.html
I understand that the input file is really large, but would it be possible to make ansi2html process the input line by line instead of all at once, or maybe in chunks? (See the sketch after the traceback.) I have 16GB of memory, and I can see ansi2html using 100% of it just before it dies with:
$ cat output | ansi2html -p > ../out.html
Traceback (most recent call last):
  File "/usr/lib/python-exec/python3.4/ansi2html", line 11, in <module>
    load_entry_point('ansi2html==1.2.0', 'console_scripts', 'ansi2html')()
  File "/usr/lib64/python3.4/site-packages/ansi2html/converter.py", line 548, in main
    output = conv.convert("".join(sys.stdin.readlines()), full=full, ensure_trailing_newline=True)
MemoryError
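Something along these lines is what I have in mind (a rough sketch against ansi2html's Python API, not a supported streaming mode; I'm assuming produce_headers() is available to emit the style block, and converting one line at a time would reset any ANSI color state that spans line boundaries):

# Sketch: stream stdin to HTML line by line instead of buffering
# the whole 8GB input, so peak memory stays roughly one line.
import sys
from ansi2html import Ansi2HTMLConverter

conv = Ansi2HTMLConverter()
out = sys.stdout

# Emit a minimal page skeleton by hand; the layout here is an
# assumption, not what ansi2html -p produces verbatim.
out.write("<!DOCTYPE html>\n<html>\n<head>\n")
out.write(conv.produce_headers())  # the <style> rules for the color classes
out.write("</head>\n<body>\n<pre>\n")

for line in sys.stdin:
    # full=False converts just this fragment; note that a color
    # opened on one line will not carry over to the next.
    out.write(conv.convert(line, full=False))

out.write("</pre>\n</body>\n</html>\n")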
Thanks
I also don't understand why it dies as soon as it exhausts the RAM, without even waiting for the swap (another 8GB) to be used :/
For now I've moved to https://github.com/kilobyte/colorized-logs as a workaround, since it seems to handle the big file properly.
Out of scope.