nipreps/niworkflows

mriqc not installed as dependency

Closed this issue · 4 comments

What happened?

After running $ python -m pip install sdcflows and then running sdcflows, I got the following error:

Traceback (most recent call last):
  File "/oak/stanford/groups/russpold/users/sjshim/miniconda3/envs/fmri_analysis/bin/sdcflows", line 8, in <module>
    sys.exit(main())
             ^^^^^^
  File "/oak/stanford/groups/russpold/users/sjshim/miniconda3/envs/fmri_analysis/lib/python3.11/site-packages/sdcflows/cli/main.py", line 178, in main
    "plugin": MultiProcPlugin(
              ^^^^^^^^^^^^^^^^
  File "/oak/stanford/groups/russpold/users/sjshim/miniconda3/envs/fmri_analysis/lib/python3.11/site-packages/niworkflows/engine/plugin.py", line 416, in __init__
    from mriqc import config
ModuleNotFoundError: No module named 'mriqc'

It was resolved once I pip-installed mriqc.
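A quick, generic way to confirm the dependency is importable before launching a run (a standard-library sketch; this helper is not part of sdcflows or niworkflows):

```python
import importlib.util

def has_module(name: str) -> bool:
    """Return True if `name` can be imported in the current environment."""
    return importlib.util.find_spec(name) is not None

# niworkflows' MultiProcPlugin raises ModuleNotFoundError at
# runtime when this prints False
print(has_module("mriqc"))
```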

What command did you use?

sdcflows BIDS sdcflow/ participant

What version of the software are you running?

2.9.0

How are you running this software?

Local installation ("bare-metal")

Is your data BIDS valid?

Yes

Are you reusing any previously computed results?

No

Please copy and paste any relevant log output.

No response

Additional information / screenshots

No response

It seems like a local installation might not be the best option in general? I'm trying the installation again in a fresh conda environment to double-check (on HPC). But if Docker/Singularity is the recommended installation method for HPC, given the restrictions on external dependencies, it might be worth adding that to the documentation.

mgxd commented

Yeah, we should not rely on MRIQC for this to work - whatever config options are being used can probably just be initialized as class attributes.
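A minimal sketch of that idea, with hypothetical attribute and option names (this is not niworkflows' actual code): keep the defaults as class attributes and treat the mriqc import as optional rather than a hard requirement.

```python
class MultiProcPlugin:
    """Sketch: plugin defaults live on the class, so importing
    mriqc is optional rather than a hard dependency."""

    # Hypothetical defaults; the real option names may differ
    stop_on_first_crash = False
    n_procs = 1

    def __init__(self, plugin_args=None):
        self.plugin_args = plugin_args or {}
        try:
            # Only consult mriqc's config if it happens to be installed
            from mriqc import config
            self._mriqc_config = config
        except ImportError:
            self._mriqc_config = None  # fall back to class attributes

plugin = MultiProcPlugin({"n_procs": 4})
```

With this pattern, environments without mriqc simply use the class-level defaults instead of crashing at plugin construction.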

> How are you running this software?
>
> Local installation ("bare-metal")
>
> It seems like a local installation might not be the best option in general? I'm trying the installation again in a fresh conda environment to double-check (on HPC). But if Docker/Singularity is the recommended installation method for HPC, given the restrictions on external dependencies, it might be worth adding that to the documentation.

mriqc isn't installed in the Docker image either.

My current workaround for a container is to:

  1. Bind a local, writeable directory to the container for installing Python packages (I point the environment variable $PYTHON_USERBASE at it).
  2. Install mriqc into that bound writeable directory, then uninstall pyyaml and reinstall pyyaml<5.4 (mriqc pulls in a newer version with breaking changes for this package):
    pip install --target="${PYTHON_USERBASE}" mriqc
    pip uninstall -y pyyaml
    pip install --target="${PYTHON_USERBASE}" "pyyaml<5.4"
  3. When I run, bind that path again and use a shell script like:
    #!/bin/bash
    export PYTHONPATH="${PYTHON_USERBASE}"
    sdcflows "${BIDS_DIR}" "${OUTPUT_DIR}" participant

to make my local copies of mriqc and pyyaml findable in the container.

Okay, this is actually a niworkflows bug, but it's affecting sdcflows.

I guess we need to release niworkflows 1.10.3, and an sdcflows 2.9.1 depending on it?