genomicsITER/NanoCLUST

medaka consensus error

tarah28 opened this issue · 1 comment

Hello! Hoping you can help with this. I am just running the test data but can't figure out how to get medaka consensus working in the pipeline. Thanks so much.

We are unable to use docker here so need to use conda.

COMMAND: nextflow run main.nf -profile test,conda

[d5/b19fa4] process > QC (1) [100%] 1 of 1 ✔
[00/bba3f0] process > fastqc (1) [100%] 1 of 1 ✔
[47/ef83da] process > kmer_freqs (1) [100%] 1 of 1 ✔
[60/c14484] process > read_clustering (1) [100%] 1 of 1 ✔
[61/562b4b] process > split_by_cluster (1) [100%] 1 of 1 ✔
[bc/107089] process > read_correction (1) [100%] 8 of 8 ✔
[d2/17da57] process > draft_selection (6) [ 75%] 6 of 8
[c4/6c6240] process > racon_pass (6) [100%] 6 of 6
[24/5ee26e] process > medaka_pass (1) [ 16%] 1 of 6, failed: 1
[- ] process > consensus_classification -
[- ] process > join_results -
[- ] process > get_abundances -
[- ] process > plot_abundances -
[97/496983] process > output_documentation [100%] 1 of 1 ✔
WARN: Sample mock4_run3bc08_5000 : Racon correction for cluster 2 failed due to not enough overlaps. Taking draft read as consensus
WARN: Sample mock4_run3bc08_5000 : Racon correction for cluster 5 failed due to not enough overlaps. Taking draft read as consensus
WARN: Sample mock4_run3bc08_5000 : Racon correction for cluster 1 failed due to not enough overlaps. Taking draft read as consensus
WARN: Sample mock4_run3bc08_5000 : Racon correction for cluster 4 failed due to not enough overlaps. Taking draft read as consensus
WARN: Sample mock4_run3bc08_5000 : Racon correction for cluster 7 failed due to not enough overlaps. Taking draft read as consensus
WARN: Sample mock4_run3bc08_5000 : Racon correction for cluster 6 failed due to not enough overlaps. Taking draft read as consensus
Error executing process > 'medaka_pass (1)'

Caused by:
Process medaka_pass (1) terminated with an error exit status (1)

Command executed:

if medaka_consensus -i corrected_reads.correctedReads.fasta -d racon_consensus.fasta -o consensus_medaka.fasta -t 4 -m r941_min_high_g303 ; then
    echo "Command succeeded"
else
    cat racon_consensus.fasta > consensus_medaka.fasta
fi

Command exit status:
1

Command output:
Checking program versions
This is medaka 1.2.2
Program Version Required Pass
bcftools 1.11 1.9 True
bgzip 1.11 1.9 True
minimap2 2.17 2.11 True
samtools 1.11 1.9 True
tabix 1.11 1.9 True
Aligning basecalls to draft
Removing previous index file /home/tlynch/programs/NanoCLUST/work/5d/a447253a90b3dad02af6df1295581a/racon_consensus.fasta.mmi
Removing previous index file /home/tlynch/programs/NanoCLUST/work/5d/a447253a90b3dad02af6df1295581a/racon_consensus.fasta.fai
Running medaka consensus
Failed to run medaka consensus.

Command error:
  File "/home/tlynch/miniconda3/envs/nextflow22.10/lib/python3.7/site-packages/tensorflow/python/keras/engine/sequential.py", line 208, in add
    layer(x)
  File "/home/tlynch/miniconda3/envs/nextflow22.10/lib/python3.7/site-packages/tensorflow/python/keras/layers/wrappers.py", line 539, in call
    return super(Bidirectional, self).call(inputs, **kwargs)
  File "/home/tlynch/miniconda3/envs/nextflow22.10/lib/python3.7/site-packages/tensorflow/python/keras/engine/base_layer.py", line 952, in call
    input_list)
  File "/home/tlynch/miniconda3/envs/nextflow22.10/lib/python3.7/site-packages/tensorflow/python/keras/engine/base_layer.py", line 1091, in _functional_construction_call
    inputs, input_masks, args, kwargs)
  File "/home/tlynch/miniconda3/envs/nextflow22.10/lib/python3.7/site-packages/tensorflow/python/keras/engine/base_layer.py", line 822, in _keras_tensor_symbolic_call
    return self._infer_output_signature(inputs, args, kwargs, input_masks)
  File "/home/tlynch/miniconda3/envs/nextflow22.10/lib/python3.7/site-packages/tensorflow/python/keras/engine/base_layer.py", line 863, in _infer_output_signature
    outputs = call_fn(inputs, *args, **kwargs)
  File "/home/tlynch/miniconda3/envs/nextflow22.10/lib/python3.7/site-packages/tensorflow/python/keras/layers/wrappers.py", line 653, in call
    initial_state=forward_state, **kwargs)
  File "/home/tlynch/miniconda3/envs/nextflow22.10/lib/python3.7/site-packages/tensorflow/python/keras/layers/recurrent.py", line 660, in call
    return super(RNN, self).call(inputs, **kwargs)
  File "/home/tlynch/miniconda3/envs/nextflow22.10/lib/python3.7/site-packages/tensorflow/python/keras/engine/base_layer.py", line 1012, in call
    outputs = call_fn(inputs, *args, **kwargs)
  File "/home/tlynch/miniconda3/envs/nextflow22.10/lib/python3.7/site-packages/tensorflow/python/keras/layers/recurrent_v2.py", line 439, in call
    inputs, initial_state, _ = self._process_inputs(inputs, initial_state, None)
  File "/home/tlynch/miniconda3/envs/nextflow22.10/lib/python3.7/site-packages/tensorflow/python/keras/layers/recurrent.py", line 859, in _process_inputs
    initial_state = self.get_initial_state(inputs)
  File "/home/tlynch/miniconda3/envs/nextflow22.10/lib/python3.7/site-packages/tensorflow/python/keras/layers/recurrent.py", line 643, in get_initial_state
    inputs=None, batch_size=batch_size, dtype=dtype)
  File "/home/tlynch/miniconda3/envs/nextflow22.10/lib/python3.7/site-packages/tensorflow/python/keras/layers/recurrent.py", line 1948, in get_initial_state
    return _generate_zero_filled_state_for_cell(self, inputs, batch_size, dtype)
  File "/home/tlynch/miniconda3/envs/nextflow22.10/lib/python3.7/site-packages/tensorflow/python/keras/layers/recurrent.py", line 2987, in _generate_zero_filled_state_for_cell
    return _generate_zero_filled_state(batch_size, cell.state_size, dtype)
  File "/home/tlynch/miniconda3/envs/nextflow22.10/lib/python3.7/site-packages/tensorflow/python/keras/layers/recurrent.py", line 3005, in _generate_zero_filled_state
    return create_zeros(state_size)
  File "/home/tlynch/miniconda3/envs/nextflow22.10/lib/python3.7/site-packages/tensorflow/python/keras/layers/recurrent.py", line 3000, in create_zeros
    return array_ops.zeros(init_state_size, dtype=dtype)
  File "/home/tlynch/miniconda3/envs/nextflow22.10/lib/python3.7/site-packages/tensorflow/python/util/dispatch.py", line 201, in wrapper
    return target(*args, **kwargs)
  File "/home/tlynch/miniconda3/envs/nextflow22.10/lib/python3.7/site-packages/tensorflow/python/ops/array_ops.py", line 2819, in wrapped
    tensor = fun(*args, **kwargs)
  File "/home/tlynch/miniconda3/envs/nextflow22.10/lib/python3.7/site-packages/tensorflow/python/ops/array_ops.py", line 2868, in zeros
    output = _constant_if_small(zero, shape, dtype, name)
  File "/home/tlynch/miniconda3/envs/nextflow22.10/lib/python3.7/site-packages/tensorflow/python/ops/array_ops.py", line 2804, in _constant_if_small
    if np.prod(shape) < 1000:
  File "<array_function internals>", line 6, in prod
  File "/home/tlynch/miniconda3/envs/nextflow22.10/lib/python3.7/site-packages/numpy/core/fromnumeric.py", line 3052, in prod
    keepdims=keepdims, initial=initial, where=where)
  File "/home/tlynch/miniconda3/envs/nextflow22.10/lib/python3.7/site-packages/numpy/core/fromnumeric.py", line 86, in wrapreduction
    return ufunc.reduce(obj, axis, dtype, out, **passkwargs)
  File "/home/tlynch/miniconda3/envs/nextflow22.10/lib/python3.7/site-packages/tensorflow/python/framework/ops.py", line 855, in _array
    " a NumPy call, which is not supported".format(self.name))
NotImplementedError: Cannot convert a symbolic Tensor (bidirectional/forward_gru1/strided_slice:0) to a numpy array. This error may indicate that you're trying to pass a Tensor to a NumPy call, which is not supported
Failed to run medaka consensus.
.command.sh: line 5: consensus_medaka.fasta: Is a directory

Work dir:
/home/tlynch/programs/NanoCLUST/work/24/5ee26e068dad1554269abe4696aeb7

Tip: you can try to figure out what's wrong by changing to the process work dir and showing the script file named .command.sh

[nf-core/nanoclust] Pipeline completed with errors
WARN: Killing running tasks (3)

Hi!

I ran into the same issue and fixed it by changing a line in main.nf.
I changed line 408:
cat $draft > consensus_medaka.fasta
To:
cat $draft > consensus_medaka.fasta/consensus.fasta
Without this change, the fallback fails whenever medaka consensus fails: consensus_medaka.fasta already exists as a directory at that point, which is what produces the ".command.sh: line 5: consensus_medaka.fasta: Is a directory" error in your log.
I hope this works for you as well!
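For anyone curious why the original fallback line fails, here is a minimal, standalone sketch (plain shell, no medaka required; the `demo/` paths are made up for illustration). medaka_consensus treats `-o` as an output *directory* and creates it before crashing, so the redirect in the else-branch then points at a directory instead of a file:

```shell
# Stand-in for medaka's partially created output directory and the racon draft
mkdir -p demo/consensus_medaka.fasta
printf '>cluster1\nACGT\n' > demo/racon_consensus.fasta

# The original fallback: redirecting into an existing directory fails ("Is a directory")
if cat demo/racon_consensus.fasta > demo/consensus_medaka.fasta 2>/dev/null; then
    echo "unexpected: redirect into a directory worked"
else
    # The fix from main.nf: write a file *inside* the output directory instead
    cat demo/racon_consensus.fasta > demo/consensus_medaka.fasta/consensus.fasta
    echo "fallback consensus written"
fi
```

Note this only repairs the fallback path; the medaka crash itself still happens and is logged, but the pipeline can then carry the racon draft forward as the consensus.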