jcreinhold/intensity-normalization

Segmentation fault (core dumped)

zzz123xyz opened this issue · 9 comments

๐Ÿ› Bug

Hi Jacob
I followed your 5min_tutorial (https://github.com/jcreinhold/intensity-normalization/blob/master/tutorials/5min_tutorial.md) and used . ./create_env.sh to create a virtual environment for intensity-normalization. When I try to simply process some samples, I run into a segmentation fault. I don't know how this happens; I was able to run the tool a while ago and it worked. I have tried on both of my servers:

On one server, it shows:
intensity_normalization.utilities.preprocess - INFO - Preprocessing image: Sample_T1 (1/1)
Segmentation fault (core dumped)

On the other server, it shows:
intensity_normalization.utilities.preprocess - INFO - Preprocessing image: Sample_T1 (1/1)
Segmentation fault

The strange thing is that server 2 is where I had previously installed the package and run it successfully; this time it fails with the same error.

Please help. Thanks

To Reproduce

Steps to reproduce the behavior:

  1. install the package using . ./create_env.sh
  2. run the preprocessing on some sample images
  3. error: Segmentation fault (core dumped)

Environment

  • intensity-normalization version: 1.4.3
  • numpy version: 1.16.5
  • OS: Linux
  • How you installed intensity-normalization: conda, . ./create_env.sh
  • Python version: 3.7.7

Hi @zzz123xyz what is the exact command you ran to generate this error?

Please provide that information, and then run the command with the option -vv and provide the full output generated (including the line which contains the entered command).

Hi Jacob

Thanks for the fast reply.

Here is what I run on my server at the moment (the installation worked a while ago):

conda info --envs
conda activate intensity_normalization

IMG_DIR=<my path>/IXI_samples_test/test_3_samples_preprocess
MASK_DIR=<my path>/IXI_samples_test/test_3_samples_masks
PREP_DIR=<my path>/IXI_samples_test/test_3_samples_preprocess_output
preprocess -i $IMG_DIR -m $MASK_DIR -o $PREP_DIR -r 1 1 1 -vv

Here is the output:

2020-07-29 23:27:40,687 - intensity_normalization.utilities.preprocess - DEBUG - N4 Options are: {'iters': [200, 200, 200, 200], 'tol': 0.0005}
2020-07-29 23:27:40,697 - intensity_normalization.utilities.preprocess - INFO - Preprocessing image: IXI002-Guys-0828-T1 (1/3)
Segmentation fault (core dumped)

Please let me know if you need more information. Thanks so much.

Can you provide some information about what the underlying images are? Are they 3D? It looks like you are using IXI, but did you do any other preprocessing to the images before running them through this script? (I assume there are 3 .nii images in both of IMG_DIR and MASK_DIR, correct?)

Can you cd to the intensity_normalization directory (where you ran . ./create_env.sh) and run nosetests -v tests/ and provide the output from that command?

Are the masks the same size as the corresponding images? That is, if the first image in IMG_DIR is 128x128x128, is the corresponding mask in MASK_DIR 128x128x128? Make sure there are no differences in dimension, and make sure there are no empty dimensions (e.g., in the previous scenario, the mask has dimension 1x128x128x128).
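
If it helps, here is a rough (untested) sketch that prints the image and mask dimensions side by side so any mismatch or extra singleton dimension stands out. Fill in your own IMG_DIR and MASK_DIR; it assumes the image and mask files pair up when sorted:

# rough sketch: compare image and mask dimensions pair by pair
from glob import glob
import ants

IMG_DIR = "/path/to/your/images"   # fill these in
MASK_DIR = "/path/to/your/masks"

imgs = sorted(glob(IMG_DIR + "/*.nii*"))
masks = sorted(glob(MASK_DIR + "/*.nii*"))

for img_fn, mask_fn in zip(imgs, masks):
    img, mask = ants.image_read(img_fn), ants.image_read(mask_fn)
    print(img_fn, img.shape, "|", mask_fn, mask.shape)
    if img.shape != mask.shape:
        print("  -> dimension mismatch!")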

Hi Jacob
Yes, I am using the IXI data to test this. The images are all 3D, I did not do any other preprocessing, and there are 3 test samples in both IMG_DIR and MASK_DIR.
The result of running nosetests -v tests/ is as follows:

    ~/nas_home/intensity-normalization$ nosetests -v tests/

----------------------------------------------------------------------
Ran 0 tests in 0.007s

OK

The dimensions of the IXI sample and of the brain mask from BrainSuite are as follows:

(screenshots of the image and mask headers attached)

I found that the dimensions of the BrainSuite brain mask are different from the original samples. I tried it anyway and got the same segmentation fault.
Then I resampled the samples into MNI space to match the BrainSuite brain mask; the new dimension comparison is as follows:

(screenshots of the resampled image and mask headers attached)

I tried with the new samples and it still fails with a segmentation fault (core dumped).

I tried to debug preprocess.py in intensity_normalization/utilities with the following script:

import ants
from intensity_normalization.utilities.preprocess import preprocess

img_dir = '/home/uqlliu10/nas_home/MRI_segmentation/IXI_samples_test/test_3_samples_preprocess_in_mni'
mask_dir = '/home/uqlliu10/nas_home/MRI_segmentation/IXI_samples_test/test_3_samples_masks'
out_dir = '/home/uqlliu10/nas_home/MRI_segmentation/IXI_samples_test/test_3_samples_preprocess_output'
preprocess(img_dir, mask_dir, out_dir)
print('done')

preprocess.py fails at line 81: img = ants.n4_bias_field_correction(img, convergence=n4_opts). The run ends with "Process finished with exit code 139"; I looked it up online, and it seems to indicate a memory error (exit code 139 corresponds to a segmentation fault).
Could that be an ants version problem?

Thanks

The nosetests didn't run. You must not have been inside the intensity_normalization directory. (By inside, I mean that when you run ls by itself you see the create_env.sh script.) Can you please try that again? It should run 30 tests.

Thanks for the extra info. FWIW, I wasn't able to reproduce this error, although I've encountered it before a while ago. It is probably due to a bad install of antspy. You may have to download the antspy directory and run python setup.py install (while in the intensity_normalization conda env) to build it from source.

Just FYI. The preprocess script really does very little. It was meant to be a simple convenience script for very basic pre-processing. If you have ANTs installed or something else to do bias field correction, you can (and should) just use that.
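
If you want to do that yourself with antspy directly, a rough, untested sketch would look like the following (not necessarily exactly what the preprocess script does internally; fill in your own filenames, and treat the interpolation choice as just an example):

# untested sketch: N4 bias field correction plus resampling with antspy directly
import ants

img = ants.image_read("/path/to/image.nii.gz")    # fill in your own filenames
mask = ants.image_read("/path/to/mask.nii.gz")    # optional; drop mask= below to skip it

corrected = ants.n4_bias_field_correction(img, mask=mask)

# resample to 1x1x1 mm spacing; interp_type=4 is B-spline interpolation
resampled = ants.resample_image(corrected, (1, 1, 1), use_voxels=False, interp_type=4)

ants.image_write(resampled, "/path/to/output.nii.gz")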

Let me know the output of the nosetests when you get the chance. Also, please run:

import ants
img_fn = "/home/uqlliu10/nas_home/MRI_segmentation/IXI_samples_test/test_3_samples_preprocess_in_mni/FILLTHISIN.nii.gz"
img = ants.image_read(img_fn)
out = ants.n4_bias_field_correction(img, convergence={'iters': [200, 200, 200, 200], 'tol': 0.0005})

where you fill in FILLTHISIN.nii.gz with a valid filename inside the test_3_samples_preprocess_in_mni directory. Also try:

import ants
img_fn = "/home/uqlliu10/nas_home/MRI_segmentation/IXI_samples_test/test_3_samples_preprocess_in_mni/FILLTHISIN.nii.gz"
img = ants.image_read(img_fn)
out = ants.n4_bias_field_correction(img)

To make sure the convergence criteria aren't bad (for some reason).

These tests will tell us if my script is doing something wrong or if the problem is in antspy.

Thanks, Jacob
I ran ls, and it really is the installation path:

:~/nas_home/intensity-normalization$ ls
antspy                                     coregister.sh       docs                              preprocess_IXI.sh  tests
antspy-0.1.7-cp36-cp36m-linux_x86_64.whl   create_env.sh       intensity_normalization           README.md          test.sh
ANTsPy-0.2.0.tar.gz                        create_env_user.sh  intensity_normalization.egg-info  requirements.txt   tutorials
antspyx-0.2.2-cp36-cp36m-linux_x86_64.whl  dist                LICENSE                           setup.py
build                                      Dockerfile          preprocess_IXI_124.sh             test_ext.py

Should I reinstall the whole intensity-normalization package as well as ants?

Hi Jacob
I tried both of the commands you indicated here:

import ants
img_fn = "/home/uqlliu10/nas_home/MRI_segmentation/IXI_samples_test/test_3_samples_preprocess_in_mni/FILLTHISIN.nii.gz"
img = ants.image_read(img_fn)
out = ants.n4_bias_field_correction(img, convergence={'iters': [200, 200, 200, 200], 'tol': 0.0005})

import ants
img_fn = "/home/uqlliu10/nas_home/MRI_segmentation/IXI_samples_test/test_3_samples_preprocess_in_mni/FILLTHISIN.nii.gz"
img = ants.image_read(img_fn)
out = ants.n4_bias_field_correction(img)

They both exit abnormally with:

Process finished with exit code 139

I think it really is ants causing the problem.
Could you tell me how to correctly install ants together with the intensity-normalization package? Should I still use the ants wheel?

Thanks. Please see: https://github.com/ANTsX/ANTsPy#installation for details on how to install antspy from source.

If you build from source (as described in the linked instructions or as described in my previous post) and you still receive an error, you should raise an issue with them, because I am unable to replicate your error.
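
Once it is rebuilt, a quick way to sanity-check the antspy install on its own, without involving your data, would be something like this (untested sketch):

# untested sketch: run N4 on a small synthetic image to isolate antspy from the data
import numpy as np
import ants

arr = np.random.rand(64, 64, 64).astype("float32")
img = ants.from_numpy(arr)
out = ants.n4_bias_field_correction(img)
print("n4 finished, output shape:", out.shape)

If that also segfaults, the problem is definitely in the antspy build rather than in your images.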

If you encounter a problem with my package after fixing the antspy install, please raise an issue. I appreciate the feedback.
Good luck!