Running LongQC always returns Killed error
Hi, thanks for creating LongQC. I am trying it on a 160 GB PacBio subreads BAM file, but I always get a Killed error. Here is my command:
/usr/src/Python-3.10.0/python /opt/LongQC/longQC.py sampleqc --output sample_longQC --preset pb-sequel --index 200M --sample_name sample --ncpu 10 --mem 1 m64065_200925_092426.subreads.bam
I tried different values for CPU, mem, and index, but I always get the Killed error. I suppose the main culprit is the creation of fastq files in the Analysis folder. Is there a way to skip that, since I only use the figures, not the fastq files? The software version is LongQC 1.2.0c.
Hi @looxon93,
Thanks for your interest in our app and for the valuable feedback.
Indeed, handling subreads.bam is much tougher than HiFi BAM in many respects. fastq (or fasta) is the only acceptable format for some internal calculations, hence the conversion from BAM to fastq. The figures are in fact generated from those fastq files, not from the BAM or any other files. I'm not sure what caused the kill, but if it's a disk-space issue, skipping this conversion is not trivial for LongQC, even for the figures alone.
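If disk space is the suspect, a quick generic check before rerunning can rule it out (this is a plain Linux sketch, not a LongQC feature; the intermediate fastq can need roughly as much space as the input BAM):

```shell
# Check free space on the filesystem holding the output directory;
# compare the "Avail" column against the ~160 GB input BAM size.
df -h .
```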
Also, in the subreads case, memory requirements can be quite high. With the default index size (4G), more than 100 GB of memory can be needed in some cases; this is particularly common for reads with a skewed GC composition. You've already reduced it to 200M (0.2G), but please make sure your environment has enough memory even for a 0.2G index, as an OOM error can also kill the process. 16 GB of memory should be enough for a 0.2G index size (--index option).
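To confirm an OOM kill rather than guess, generic Linux checks like these can help (not LongQC-specific; reading the kernel log may require elevated permissions):

```shell
# How much memory is actually available before rerunning?
grep -E 'MemTotal|MemAvailable' /proc/meminfo

# If the run was killed by the kernel OOM killer, the kernel log
# usually records it (may need root or journal access):
# dmesg | grep -i 'out of memory'
```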
Yoshinori