connectomicslab/connectomemapper3

BUG: Error in anatomical pipeline when using custom parcellation on BIDS dataset with no session

MeropGont opened this issue · 8 comments

Hi,

I want to use my custom parcellation scheme. Following the instructions on CMP Docs, first I ran the anatomical pipeline with the default config and got full results.
Then I added my custom parcellation .nii.gz and .tsv files (following the BIDS spec as described in the Docs) and created a new config file accordingly. Apparently the segmentation setting automatically changes to custom too if you want to use a custom parcellation, right? So I adjusted that as well using the previously generated desc and label files, and then I ran the pipeline.
CMP3 finds all the files correctly and the anatomical pipeline starts, but it stops after a minute and I get this error:
File "/opt/conda/envs/py37cmp-core/lib/python3.7/genericpath.py", line 153, in _check_arg_types (funcname, s.__class__.__name__)) from None TypeError: join() argument must be str or bytes, not 'NoneType'

I'm not sure if it's a bug or I configured it wrongly, any help would be appreciated.

OS: Ubuntu 20.04.4 LTS
CMP3 v3.0.3

The log file is attached.
sub-01_log.txt

Also my custom parcellation files are attached.
sub-01_atlas-AAL_res-scale1_dseg.nii.gz
sub-01_atlas-AAL_res-scale1_dseg.zip

Thanks!

Hi @MeropGont,
Please could you share the configuration file you used for the anatomical pipeline?

Hi @sebastientourbier ,

Sure, you can find it attached.
check_parcellation_rev03.zip

Thank you very much!

Everything seems to be well configured on your side, so it sounds like you found a bug.

I looked in more detail at the execution, and it seems the error appears right after the run of the workflow, at the call of ._update_parcellation_scheme():

self._update_parcellation_scheme()

Could you confirm that the files sub-01_atlas-AAL_res-scale1_dseg.graphml and sub-01_atlas-AAL_res-scale1_stats.tsv were generated?

Now, by looking at the rest of the error trace, I found the cause of the issue in cmtklib/bids/io.py (a missing condition at line 143 to test whether the dataset has a subject/session structure):

fname = f'{subject}' if session is None else f'{subject}_{session}'
filepath = os.path.join(filepath, session)

In your case (Subject only structure), session is None and this function fails.
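
To make it concrete, here is a minimal sketch of the failure outside of CMP3 (the path and values below are just examples mimicking your subject-only case, not taken from io.py):

import os

subject = 'sub-01'
session = None  # subject-only dataset: no ses-* entity
filepath = 'derivatives/cmp-v3.0.3/sub-01/anat'  # example path only

fname = f'{subject}' if session is None else f'{subject}_{session}'
# join() is then called with session=None and raises:
#   TypeError: join() argument must be str or bytes, not 'NoneType'
filepath = os.path.join(filepath, session)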

Until this is fixed and released in a new version, which I expect in ~1-2 weeks, you could make a copy of io.py on your machine and modify line 143 as follows:

if session is not None:
    filepath = os.path.join(filepath, session)

and then run the BIDS App by calling docker run with the additional argument -v /path/to/your/modified/io.py:/opt/conda/envs/py37cmp-core/lib/python3.7/site-packages/cmtklib/bids/io.py. This will replace the io.py located in the Docker container with your version.

Tell me if this works.

Also, how did you run the BIDS App? If you ran it via the GUI, the full "docker run ..." command executed is displayed as output in the terminal. Happy to help if you are not familiar with Docker.

Hi,

Many thanks for your instructions! That makes sense.

Yes, sub-01_atlas-AAL_res-scale1_dseg.graphml and sub-01_atlas-AAL_res-scale1_stats.tsv were indeed generated.

The solution of correcting io.py seems plausible, and I could figure out more or less how to work with Docker instead of the GUI. But I'm having a simple problem finding the right io.py file! I changed my io.py in this directory: /home/usr/miniconda3/env/... , but it didn't work. Apparently I need to correct it in the /opt/ directory, right? But the problem is that I don't find any conda files in my /opt! Could you please help me with this?

Best,
Fatemeh

Hi @MeropGont,

In fact, you should not really care about the io.py file that has been installed with the GUI.

For the sake of clarity and reproducibility, I would advise you to download io.py (as it is on GitHub for v3.0.3) directly from GitHub at the following link: https://github.com/connectomicslab/connectomemapper3/raw/af079cffabfad9dd1aba287ff7be1af4fc346f95/cmtklib/bids/io.py, and place this file inside the code/ folder of your BIDS dataset. This is the file you can modify and use in the BIDS App with the "mounting" trick, which corresponds to adding the extra -v option:

docker run -v /local/path/to/bids_dataset/code/io.py:/opt/conda/envs/py37cmp-core/lib/python3.7/site-packages/cmtklib/bids/io.py [...]

and here, /opt/conda/envs/py37cmp-core/lib/python3.7/site-packages/cmtklib/bids/io.py indeed corresponds to the location of this file inside the BIDS App (container), not on your machine. So, it is normal that you could not find any conda files there 👍

Hi @sebastientourbier ,

I did as you said: downloaded the io.py file from the GitHub repo, copied it into my code/ folder, and added the replacement argument to the docker command. But it doesn't seem to work. I also searched for and corrected every io.py file in my local CMP3 directory, but I still get the same error every time.

Here's my docker command:

(py37cmp-gui) fateme@fateme-HP:~$ docker run -v /home/fateme/Desktop/BIDS/code/io.py:/opt/conda/envs/py37cmp-core/lib/python3.7/site-packages/cmtklib/bids/io.py -v /home/fateme/Desktop/BIDS:/bids_dir -v /home/fateme/Desktop/BIDS/derivatives:/output_dir -v /usr/local/freesurfer/license.txt:/bids_dir/code/license.txt -v /home/fateme/Desktop/BIDS/code/check_parcellation_rev03.json:/code/ref_anatomical_config.json -u 1000:1000 sebastientourbier/connectomemapper-bidsapp:v3.0.3 /bids_dir /output_dir participant --participant_label 01 --anat_pipeline_config /code/ref_anatomical_config.json --fs_license /bids_dir/code/license.txt --number_of_participants_processed_in_parallel 1 --number_of_threads 1
Terminal output:

> BIDS dataset: /bids_dir
> Subjects to analyze : ['01']
> Set $FS_LICENSE which points to FreeSurfer license location (BIDS App)
  .. INFO: $FS_LICENSE set to /bids_dir/code/license.txt
  * Number of subjects to be processed in parallel set to 1 (Total of cores available: 7)
  * Number of parallel threads set to 1 (total of cores: 7)
  .. INFO: Report execution to Google Analytics. 
Thanks to support us in the task of finding new funds for CMP3 development!
> Process subject sub-01
  .. WARNING: rewriting config file /output_dir/cmp-v3.0.3/sub-01/sub-01_anatomical_config.json
	 ... Anatomical config created : /output_dir/cmp-v3.0.3/sub-01/sub-01_anatomical_config.json
  .. INFO: Running pipelines : 
		- Anatomical MRI (segmentation and parcellation)
  .. INFO: diffusion pipeline not performed
  .. INFO: functional pipeline not performed
... cmd : connectomemapper3 --bids_dir /bids_dir --output_dir /output_dir --participant_label sub-01 --anat_pipeline_config /output_dir/cmp-v3.0.3/sub-01/sub-01_anatomical_config.json --number_of_threads 1

And the log file is attached.
sub-01_log.txt

Hi @MeropGont !

This is really strange... Looking at your command, the additional -v seems correct and corresponds exactly to the path in the log. Based on https://docs.docker.com/storage/bind-mounts/#mount-into-a-non-empty-directory-on-the-container, we should expect the following:

If you bind-mount into a non-empty directory on the container, the directory’s existing contents are obscured by the bind mount.

You mentioned that you downloaded and copied io.py inside the code/ folder, but did you modify line 143 as follows:

if session is not None:
    filepath = os.path.join(filepath, session)

?
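
If it is modified, you could also quickly double-check that the container really sees your patched file. For example (just a sketch; the interpreter path inside the image and the file name check_io.py are assumptions on my side), save the following as code/check_io.py and run it with the same -v mounts:

# check_io.py -- verify that the io.py seen inside the container is the patched one.
# Hypothetical invocation (replacing the usual entrypoint with the container's python):
#   docker run --rm --entrypoint /opt/conda/envs/py37cmp-core/bin/python \
#       -v /home/fateme/Desktop/BIDS/code/io.py:/opt/conda/envs/py37cmp-core/lib/python3.7/site-packages/cmtklib/bids/io.py \
#       -v /home/fateme/Desktop/BIDS/code/check_io.py:/check_io.py \
#       sebastientourbier/connectomemapper-bidsapp:v3.0.3 /check_io.py
path = '/opt/conda/envs/py37cmp-core/lib/python3.7/site-packages/cmtklib/bids/io.py'
with open(path) as f:
    content = f.read()
print('patched' if 'if session is not None:' in content else 'NOT patched', path)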

Hi @sebastientourbier ,

Yes, you're right, sorry, I had forgotten to modify the io.py file this time.
OK I think it's working now!
The log: sub-01_log.txt

Thanks!