File too large json.dumps
iyabchen opened this issue · 2 comments
Tried to dump an index of 300+ MB and ran into:
$ es2csv -q '*' -i 'metrics-2017.10.20' -o /tmp/test.csv -u http://172.18.0.2:9200
Found 679860 results
Run query [ ] [30701/679860] [ 4%] [0:00:09] [ETA: 0:03:26] [ 3.1 Kidocs/s]Traceback (most recent call last):
  File "/usr/local/bin/es2csv", line 11, in <module>
    sys.exit(main())
  File "/usr/local/lib/python2.7/dist-packages/es2csv.py", line 283, in main
    es.search_query()
  File "/usr/local/lib/python2.7/dist-packages/es2csv.py", line 40, in f_retry
    return f(*args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/es2csv.py", line 170, in search_query
    self.flush_to_file(hit_list)
  File "/usr/local/lib/python2.7/dist-packages/es2csv.py", line 212, in flush_to_file
    tmp_file.write('%s\n' % json.dumps(out))
IOError: [Errno 27] File too large
Hello, what filesystem are you using? This error is weird. I have worked with indexes larger than 1 GB without any issues.
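For what it's worth, `[Errno 27]` is `EFBIG`, which the OS raises when a write would push a file past either the filesystem's maximum file size (e.g. FAT32 caps files at 4 GB) or the process's own file-size limit (`ulimit -f`). A quick sketch to rule out the latter, assuming a Unix-like system (this is a diagnostic guess, not something es2csv itself does):

```python
import resource

# EFBIG (errno 27) can come from RLIMIT_FSIZE, the per-process cap on
# the size of files the process may create. Print the current limits.
soft, hard = resource.getrlimit(resource.RLIMIT_FSIZE)

def fmt(limit):
    # RLIM_INFINITY means no cap is set for this process.
    return 'unlimited' if limit == resource.RLIM_INFINITY else str(limit)

print('RLIMIT_FSIZE soft: %s' % fmt(soft))
print('RLIMIT_FSIZE hard: %s' % fmt(hard))
```

If the soft limit is smaller than the expected output file, raising it (or running `ulimit -f unlimited` in the shell) would be worth trying before blaming the filesystem.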
Cannot reproduce; it may have been due to a lack of space on the hard drive. Thanks for answering.