DCAN-Labs/BIBSnet

RuntimeError: Could not find a task with the ID (540)

Closed · 1 comment

I am opening this issue to re-open #108. I am using the latest version of BIBSnet and hit the same error (details below).

Here is my command:
singularity run --nv --cleanenv --no-home \
	-B /home/path/to/project/bids:/input \
	-B /home/path/to/project/derivatives/BIBSnet:/output \
	-B /home/path/to/scratch:/workdir \
	-B /home/path/to/freesurfer/license.txt:/opt/freesurfer/license.txt \
	/home/hubers2/dev/images/bibsnet_fork.sif \
	/input /output participant \
	-w workdir -participant 999 --session newborn \
Here is the stack trace:
"uname": executable file not found in $PATH
Matplotlib created a temporary config/cache directory at /tmp/matplotlib-fy20gfcs because the default path (/home/hubers2/.config/matplotlib) is not a writable directory; it is highly recommended to set the MPLCONFIGDIR environment variable to a writable directory, in particular to speed up the import of Matplotlib and to better support multiprocessing.

INFO 2024-09-04 18:34:13,713: #### MAKING A DIRECTORY: /tmp/bibsnet/prebibsnet/sub-1346/ses-newborn

INFO 2024-09-04 18:34:13,713: #### MAKING A DIRECTORY: /tmp/bibsnet/bibsnet/sub-1346/ses-newborn

INFO 2024-09-04 18:34:13,714: #### MAKING A DIRECTORY: /tmp/bibsnet/postbibsnet/sub-1346/ses-newborn

INFO 2024-09-04 18:34:13,732: #### MAKING THIS DIR: /tmp/bibsnet/bibsnet/sub-1346/ses-newborn/input

INFO 2024-09-04 18:34:13,732: #### MAKING THIS DIR: /tmp/bibsnet/bibsnet/sub-1346/ses-newborn/output

INFO 2024-09-04 18:34:13,734: Now running prebibsnet stage on:
{'subject': 'sub-1346', 'session': 'ses-newborn', 'has_T1w': True, 'has_T2w': True, 'model': 540}

INFO 2024-09-04 18:34:13,734: ### sub_ses_j_args:
 {'common': {'fsl_bin_path': '/opt/fsl-6.0.5.1/bin', 'bids_dir': '/input', 'overwrite': False, 'work_dir': '/tmp/bibsnet'}, 'bibsnet': {'model': '3d_fullres', 'nnUNet_predict_path': '/opt/conda/bin/nnUNet_predict'}, 'stage_names': {'start': 'prebibsnet', 'end': 'postbibsnet'}, 'optional_out_dirs': {'prebibsnet': '/tmp/bibsnet/prebibsnet', 'bibsnet': '/tmp/bibsnet/bibsnet', 'postbibsnet': '/tmp/bibsnet/postbibsnet', 'derivatives': '/output'}, 'ID': {'subject': 'sub-1346', 'session': 'ses-newborn', 'has_T1w': True, 'has_T2w': True, 'model': 540}, 'optimal_resized': {'T1w': '/tmp/bibsnet/bibsnet/sub-1346/ses-newborn/input/sub-1346_ses-newborn_optimal_resized_0000.nii.gz', 'T2w': '/tmp/bibsnet/bibsnet/sub-1346/ses-newborn/input/sub-1346_ses-newborn_optimal_resized_0001.nii.gz'}}
----------------------

INFO 2024-09-04 18:34:13,734: ### MAKING work_dirname: /tmp/bibsnet/prebibsnet/sub-1346/ses-newborn/averaged

INFO 2024-09-04 18:34:13,734: ### MAKING work_dirname: /tmp/bibsnet/prebibsnet/sub-1346/ses-newborn/cropped

INFO 2024-09-04 18:34:13,734: ### MAKING work_dirname: /tmp/bibsnet/prebibsnet/sub-1346/ses-newborn/resized

INFO 2024-09-04 18:34:13,735: #### MAKING crop_dir: /tmp/bibsnet/prebibsnet/sub-1346/ses-newborn/cropped/T1w

INFO 2024-09-04 18:34:13,735: #### MAKING crop_dir: /tmp/bibsnet/prebibsnet/sub-1346/ses-newborn/cropped/T2w

INFO 2024-09-04 18:34:13,780: Denoising input avg image
240904-18:34:14,949 nipype.workflow INFO:
	 Workflow T1w_denoise_and_bfcorrect settings: ['check', 'execution', 'logging', 'monitoring']
240904-18:34:14,955 nipype.workflow INFO:
	 Running serially.
240904-18:34:14,955 nipype.workflow INFO:
	 [Node] Setting-up "T1w_denoise_and_bfcorrect.clip" in "/tmp/bibsnet/prebibsnet/sub-1346/ses-newborn/averaged/T1w_denoise_and_bfcorrect/clip".
240904-18:34:14,956 nipype.workflow INFO:
	 [Node] Executing "clip" <niworkflows.interfaces.nibabel.IntensityClip>
240904-18:34:26,998 nipype.workflow INFO:
	 [Node] Finished "clip", elapsed time 12.040683s.
240904-18:34:31,165 nipype.workflow INFO:
	 [Node] Setting-up "T1w_denoise_and_bfcorrect.denoise" in "/tmp/bibsnet/prebibsnet/sub-1346/ses-newborn/averaged/T1w_denoise_and_bfcorrect/denoise".
240904-18:34:31,171 nipype.workflow INFO:
	 [Node] Executing "denoise" <nipype.interfaces.ants.segmentation.DenoiseImage>
240904-18:36:17,488 nipype.workflow INFO:
	 [Node] Finished "denoise", elapsed time 106.315256s.
240904-18:36:17,490 nipype.workflow INFO:
	 [Node] Setting-up "T1w_denoise_and_bfcorrect.n4_correct" in "/tmp/bibsnet/prebibsnet/sub-1346/ses-newborn/averaged/T1w_denoise_and_bfcorrect/n4_correct".
240904-18:36:17,493 nipype.workflow INFO:
	 [Node] Executing "n4_correct" <nipype.interfaces.ants.segmentation.N4BiasFieldCorrection>
240904-18:39:51,43 nipype.workflow INFO:
	 [Node] Finished "n4_correct", elapsed time 213.549549s.
240904-18:39:51,53 nipype.workflow INFO:
	 [Node] Setting-up "T1w_denoise_and_bfcorrect.final_clip" in "/tmp/bibsnet/prebibsnet/sub-1346/ses-newborn/averaged/T1w_denoise_and_bfcorrect/final_clip".
240904-18:39:51,55 nipype.workflow INFO:
	 [Node] Executing "final_clip" <niworkflows.interfaces.nibabel.IntensityClip>
240904-18:40:00,747 nipype.workflow INFO:
	 [Node] Finished "final_clip", elapsed time 9.691259s.

INFO 2024-09-04 18:40:00,759: Denoising input avg image
240904-18:40:00,762 nipype.workflow INFO:
	 Workflow T2w_denoise_and_bfcorrect settings: ['check', 'execution', 'logging', 'monitoring']
240904-18:40:00,765 nipype.workflow INFO:
	 Running serially.
240904-18:40:00,765 nipype.workflow INFO:
	 [Node] Setting-up "T2w_denoise_and_bfcorrect.clip" in "/tmp/bibsnet/prebibsnet/sub-1346/ses-newborn/averaged/T2w_denoise_and_bfcorrect/clip".
240904-18:40:00,766 nipype.workflow INFO:
	 [Node] Executing "clip" <niworkflows.interfaces.nibabel.IntensityClip>
240904-18:40:12,126 nipype.workflow INFO:
	 [Node] Finished "clip", elapsed time 11.359511s.
240904-18:40:12,129 nipype.workflow INFO:
	 [Node] Setting-up "T2w_denoise_and_bfcorrect.denoise" in "/tmp/bibsnet/prebibsnet/sub-1346/ses-newborn/averaged/T2w_denoise_and_bfcorrect/denoise".
240904-18:40:12,131 nipype.workflow INFO:
	 [Node] Executing "denoise" <nipype.interfaces.ants.segmentation.DenoiseImage>
240904-18:42:20,557 nipype.workflow INFO:
	 [Node] Finished "denoise", elapsed time 128.424854s.
240904-18:42:20,586 nipype.workflow INFO:
	 [Node] Setting-up "T2w_denoise_and_bfcorrect.n4_correct" in "/tmp/bibsnet/prebibsnet/sub-1346/ses-newborn/averaged/T2w_denoise_and_bfcorrect/n4_correct".
240904-18:42:20,593 nipype.workflow INFO:
	 [Node] Executing "n4_correct" <nipype.interfaces.ants.segmentation.N4BiasFieldCorrection>
240904-18:45:51,150 nipype.workflow INFO:
	 [Node] Finished "n4_correct", elapsed time 210.555093s.
240904-18:45:51,160 nipype.workflow INFO:
	 [Node] Setting-up "T2w_denoise_and_bfcorrect.final_clip" in "/tmp/bibsnet/prebibsnet/sub-1346/ses-newborn/averaged/T2w_denoise_and_bfcorrect/final_clip".
240904-18:45:51,162 nipype.workflow INFO:
	 [Node] Executing "final_clip" <niworkflows.interfaces.nibabel.IntensityClip>
240904-18:46:01,8 nipype.workflow INFO:
	 [Node] Finished "final_clip", elapsed time 9.844879s.
Configuring model on the CPU
Running SynthStrip model version 1
Input image read from: /tmp/bibsnet/prebibsnet/sub-1346/ses-newborn/averaged/sub-1346_ses-newborn_0000.nii.gz
Masked image saved to: /tmp/bibsnet/prebibsnet/sub-1346/ses-newborn/cropped/T1w/skullstripped.nii.gz
Binary brain mask saved to: /tmp/bibsnet/prebibsnet/sub-1346/ses-newborn/cropped/T1w/brainmask.nii.gz
If you use SynthStrip in your analysis, please cite:
----------------------------------------------------
SynthStrip: Skull-Stripping for Any Brain Image.
A Hoopes, JS Mora, AV Dalca, B Fischl, M Hoffmann.
Configuring model on the CPU
Running SynthStrip model version 1
Input image read from: /tmp/bibsnet/prebibsnet/sub-1346/ses-newborn/averaged/sub-1346_ses-newborn_0001.nii.gz
Masked image saved to: /tmp/bibsnet/prebibsnet/sub-1346/ses-newborn/cropped/T2w/skullstripped.nii.gz
Binary brain mask saved to: /tmp/bibsnet/prebibsnet/sub-1346/ses-newborn/cropped/T2w/brainmask.nii.gz
If you use SynthStrip in your analysis, please cite:
----------------------------------------------------
SynthStrip: Skull-Stripping for Any Brain Image.
A Hoopes, JS Mora, AV Dalca, B Fischl, M Hoffmann.
Now cropping average image at z-coordinate plane 25
Now cropping average image at z-coordinate plane 26

INFO 2024-09-04 18:46:32,726: The anatomical images have been cropped for use in BIBSnet

INFO 2024-09-04 18:46:32,726: ### MAKING xfm_non_ACPC_vars['out_dir']: /tmp/bibsnet/prebibsnet/sub-1346/ses-newborn/resized/xfms
WARNING: Both reference and input images have an sform matrix set

INFO 2024-09-04 18:49:07,561: ### MAKING xfm_ACPC_vars['out_dir']: /tmp/bibsnet/prebibsnet/sub-1346/ses-newborn/resized/ACPC_align
WARNING: Both reference and input images have an sform matrix set

INFO 2024-09-04 18:53:00,582: Using only T2w-to-T1w registration for resizing.

INFO 2024-09-04 18:53:00,584: The anatomical images have been resized for use in BIBSnet

INFO 2024-09-04 18:53:00,586: MAKING DIRECTORY out_nii_fpath_dir: /tmp/bibsnet/bibsnet/sub-1346/ses-newborn/input

INFO 2024-09-04 18:53:00,591: MAKING DIRECTORY out_nii_fpath_dir: /tmp/bibsnet/bibsnet/sub-1346/ses-newborn/input

INFO 2024-09-04 18:53:00,596: prebibsnet finished on subject sub-1346 session ses-newborn. Time elapsed since prebibsnet started: 0:18:46.861969

INFO 2024-09-04 18:53:00,596: Now running bibsnet stage on:
{'subject': 'sub-1346', 'session': 'ses-newborn', 'has_T1w': True, 'has_T2w': True, 'model': 540}

INFO 2024-09-04 18:53:00,596: ### sub_ses_j_args:
 {'common': {'fsl_bin_path': '/opt/fsl-6.0.5.1/bin', 'bids_dir': '/input', 'overwrite': False, 'work_dir': '/tmp/bibsnet'}, 'bibsnet': {'model': '3d_fullres', 'nnUNet_predict_path': '/opt/conda/bin/nnUNet_predict'}, 'stage_names': {'start': 'prebibsnet', 'end': 'postbibsnet'}, 'optional_out_dirs': {'prebibsnet': '/tmp/bibsnet/prebibsnet', 'bibsnet': '/tmp/bibsnet/bibsnet', 'postbibsnet': '/tmp/bibsnet/postbibsnet', 'derivatives': '/output'}, 'ID': {'subject': 'sub-1346', 'session': 'ses-newborn', 'has_T1w': True, 'has_T2w': True, 'model': 540}, 'optimal_resized': {'T1w': '/tmp/bibsnet/bibsnet/sub-1346/ses-newborn/input/sub-1346_ses-newborn_optimal_resized_0000.nii.gz', 'T2w': '/tmp/bibsnet/bibsnet/sub-1346/ses-newborn/input/sub-1346_ses-newborn_optimal_resized_0001.nii.gz'}}
----------------------

INFO 2024-09-04 18:53:00,599: ### MAKING DIRECTORIES /tmp/bibsnet/bibsnet/sub-1346/ses-newborn/output

INFO 2024-09-04 18:53:00,599: Now running BIBSnet with these parameters:
{'model': '3d_fullres', 'nnUNet': '/opt/conda/bin/nnUNet_predict', 'input': '/tmp/bibsnet/bibsnet/sub-1346/ses-newborn/input', 'output': '/tmp/bibsnet/bibsnet/sub-1346/ses-newborn/output', 'task': 540}

/home/bibsnet/nnUNet/nnunet/paths.py:47: UserWarning: %%%%% MAKING preprocessing_output_dir: /opt/nnUNet/nnUNet_raw_data_base/nnUNet_preprocessed
  warn(f"%%%%% MAKING preprocessing_output_dir: {preprocessing_output_dir}")
/home/bibsnet/nnUNet/nnunet/paths.py:56: UserWarning: %%%% MAKING network_training_output_dir: /opt/nnUNet/nnUNet_raw_data_base/nnUNet_trained_models/nnUNet
  warn(f"%%%% MAKING network_training_output_dir: {network_training_output_dir}")
/home/bibsnet/nnUNet/nnunet/inference/predict_simple.py:159: UserWarning: ### Using task name 540
  warn(f"### Using task name {task_name}")
/home/bibsnet/nnUNet/nnunet/utilities/task_name_id_conversion.py:24: UserWarning: ### Converting Task ID 540 to task name
  warn(f"### Converting Task ID {task_id} to task name")
/home/bibsnet/nnUNet/nnunet/utilities/task_name_id_conversion.py:26: UserWarning: Using Startswith: Task540
  warn(f"Using Startswith: {startswith}")
/home/bibsnet/nnUNet/nnunet/utilities/task_name_id_conversion.py:28: UserWarning: ##### preprocessing_output_dir IS NOT NONE
  warn("##### preprocessing_output_dir IS NOT NONE")
/home/bibsnet/nnUNet/nnunet/utilities/task_name_id_conversion.py:29: UserWarning: #### preprocessing_output_dir: /opt/nnUNet/nnUNet_raw_data_base/nnUNet_preprocessed
  warn(f"#### preprocessing_output_dir: {preprocessing_output_dir}")
/home/bibsnet/nnUNet/nnunet/utilities/task_name_id_conversion.py:30: UserWarning: ### here are the files in it: []
  warn(f"### here are the files in it: {list(Path(preprocessing_output_dir).glob('*'))}")
/home/bibsnet/nnUNet/nnunet/utilities/task_name_id_conversion.py:37: UserWarning: #### nnUNet_raw_data IS NOT NONE
  warn("#### nnUNet_raw_data IS NOT NONE")
/home/bibsnet/nnUNet/nnunet/utilities/task_name_id_conversion.py:38: UserWarning: #### nnUNet_raw_data: /output/nnUNet_raw_data
  warn(f"#### nnUNet_raw_data: {nnUNet_raw_data}")
/home/bibsnet/nnUNet/nnunet/utilities/task_name_id_conversion.py:45: UserWarning: #### nnUNet_cropped_data IS NOT NONE
  warn("#### nnUNet_cropped_data IS NOT NONE")
/home/bibsnet/nnUNet/nnunet/utilities/task_name_id_conversion.py:46: UserWarning: #### nnUNet_cropped_data: /output/nnUNet_cropped_data
  warn(f"#### nnUNet_cropped_data: {nnUNet_cropped_data}")
/home/bibsnet/nnUNet/nnunet/utilities/task_name_id_conversion.py:59: UserWarning: ### ALL CANDIDATES []
  warn(f"### ALL CANDIDATES {all_candidates}")
Traceback (most recent call last):
  File "/opt/conda/bin/nnUNet_predict", line 33, in <module>
    sys.exit(load_entry_point('nnunet', 'console_scripts', 'nnUNet_predict')())
  File "/home/bibsnet/nnUNet/nnunet/inference/predict_simple.py", line 160, in main
    task_name = convert_id_to_task_name(task_id)
  File "/home/bibsnet/nnUNet/nnunet/utilities/task_name_id_conversion.py", line 66, in convert_id_to_task_name
    raise RuntimeError("Could not find a task with the ID %d. Make sure the requested task ID exists and that "
RuntimeError: Could not find a task with the ID 540. Make sure the requested task ID exists and that nnU-Net knows where raw and preprocessed data are located (see Documentation - Installation). Here are your currently defined folders:
nnUNet_preprocessed=/opt/nnUNet/nnUNet_raw_data_base/nnUNet_preprocessed
RESULTS_FOLDER=/opt/nnUNet/nnUNet_raw_data_base/nnUNet_trained_models
nnUNet_raw_data_base=/output
If something is not right, adapt your environemnt variables.
/home/bibsnet/src/bibsnet.py:35: UserWarning: ### INSIDE run_BIBSnet. Model: 3d_fullres
  warn(f"### INSIDE run_BIBSnet. Model: {j_args['bibsnet']['model']}")

ERROR 2024-09-04 18:53:03,984: nnUNet failed to complete, exitcode 1
Error: Output segmentation file not created at the path below during nnUNet_predict run.
/tmp/bibsnet/bibsnet/sub-1346/ses-newborn/output

For your input files at the path below, check their filenames and visually inspect them if needed.
/tmp/bibsnet/bibsnet/sub-1346/ses-newborn/input

From what I can tell, this is the issue:

  • nnUNet expects the input data to be at /output/nnUNet_raw_data inside the container,
  • but BIBSnet saves the input data to /tmp/bibsnet/bibsnet/sub-999/ses-newborn/input,
  • and in my case, /output/nnUNet_raw_data is empty (a quick check for this is sketched below).
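
As a quick sanity check, a rough sketch like the one below (run inside the container with the same -B binds as the run command, e.g. via singularity exec ... python3; the paths are copied straight from the error output above) shows whether any of the locations nnUNet searches actually contains a Task540 folder:

# Python
from pathlib import Path

# Folders nnUNet lists under "currently defined folders" in the error above
for base in ("/output/nnUNet_raw_data",
             "/output/nnUNet_cropped_data",
             "/opt/nnUNet/nnUNet_raw_data_base/nnUNet_preprocessed",
             "/opt/nnUNet/nnUNet_raw_data_base/nnUNet_trained_models"):
    p = Path(base)
    # Look for any directory whose name starts with "Task540" anywhere under each base
    hits = sorted(d.name for d in p.rglob("Task540*") if d.is_dir()) if p.is_dir() else "<missing>"
    print(f"{base}: {hits}")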

FYI, this "Could not find a task with the ID" error gets thrown here:

https://github.com/MIC-DKFZ/nnUNet/blob/4f2ffabe751977ee66348560c8e99102e8553195/nnunet/utilities/task_name_id_conversion.py#L50-L60
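
If I am reading those lines right, the lookup boils down to roughly the sketch below. This is my simplified paraphrase, not the actual nnUNet source (the real function may also scan the trained-models folder under RESULTS_FOLDER), and the default paths are just the ones printed in the error above:

# Python
from pathlib import Path

def convert_id_to_task_name(task_id,
                            preprocessing_output_dir="/opt/nnUNet/nnUNet_raw_data_base/nnUNet_preprocessed",
                            nnUNet_raw_data="/output/nnUNet_raw_data",
                            nnUNet_cropped_data="/output/nnUNet_cropped_data"):
    startswith = "Task%03d" % task_id  # e.g. "Task540"
    candidates = []
    for base in (preprocessing_output_dir, nnUNet_raw_data, nnUNet_cropped_data):
        if base is not None and Path(base).is_dir():
            # Collect every subdirectory whose name starts with "Task540"
            candidates += [p.name for p in Path(base).iterdir()
                           if p.is_dir() and p.name.startswith(startswith)]
    if not candidates:
        # This is the RuntimeError in my stack trace: no "Task540*" folder was found anywhere
        raise RuntimeError(f"Could not find a task with the ID {task_id}.")
    return sorted(set(candidates))[0]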

What I am trying to understand is why BIBSnet does not copy the input files from /tmp/bibsnet/bibsnet/sub-999/ses-newborn/input to /output/nnUNet_raw_data and format them into the structure that nnUNet expects.

Any ideas? I'm happy to help debug this further for the benefit of others who hit this issue, but I'm several hours in and would love a hint!

xref #108

Cross-referencing a couple of nnUNet tickets where this issue has been raised:

xref MIC-DKFZ/nnUNet#2405
xref MIC-DKFZ/nnUNet#2422

EDIT: see the next comment for a minimal working example

FYI, here is a self-contained reproducible example:

Requires openneuro-py (pip install openneuro-py)

# Python
from pathlib import Path
import openneuro

Path("./ds004776").mkdir()

openneuro.download(dataset="ds004776", target_dir="./ds004776", include="sub-01")
singularity pull bibsnet.sif docker://dcanumn/bibsnet:latest
# bash

singularity pull bibsnet.sif docker://dcanumn/bibsnet:latest

mkdir -p derivatives/BIBSnet

singularity run --nv --cleanenv --no-home \
	-B /path/to/ds004776:/input \
	-B /path/to/derivatives/BIBSnet:/output \
	/path/to/bibsnet.sif \
	/input /output participant \
	-participant 01 \