BioMedIA/dhcp-structural-pipeline

recon-neonatal-cortex error

alexamousley opened this issue · 8 comments

Hello,

I am trying to run the pipeline on infants from the Baby Connectome Project. Most, but not all, participants receive an error during surface analysis. I have copied the surface.err file below and attached the surface.log file.

Surface.err:
ERROR: In /usr/src/structural-pipeline/build/VTK/Filters/Core/vtkPolyDataConnectivityFilter.cxx, line 106
vtkPolyDataConnectivityFilter (0x1d7c160): No points!

ERROR: In /usr/src/structural-pipeline/build/VTK/Filters/Core/vtkPolyDataConnectivityFilter.cxx, line 106
vtkPolyDataConnectivityFilter (0x1da2910): No points!

Error: Could not find a closed intersection with finite cutting plane near segmentation boundary!
Traceback (most recent call last):
File "/usr/src/structural-pipeline/build/MIRTK/build/lib/tools/recon-neonatal-cortex", line 938, in
check=args.check)
File "/usr/src/structural-pipeline/build/MIRTK/build/lib/tools/recon-neonatal-cortex", line 587, in recon_neonatal_cortex
temp=temp_dir, check=check)
File "/usr/src/structural-pipeline/build/MIRTK/build/lib/python/mirtk/deformable/neonatal_cortex.py", line 1201, in join_cortical_surfaces
'smoothing-iterations': 100, 'smoothing-lambda': 1})
File "/usr/src/structural-pipeline/build/MIRTK/build/lib/python/mirtk/deformable/neonatal_cortex.py", line 183, in run
_run(tool, args=args, opts=opts, verbose=showcmd, threads=threads)
File "/usr/src/structural-pipeline/build/MIRTK/build/lib/python/mirtk/subprocess.py", line 162, in run
check_call(argv, verbose=verbose)
File "/usr/src/structural-pipeline/build/MIRTK/build/lib/python/mirtk/subprocess.py", line 124, in check_call
_call(argv, verbose=verbose, execfunc=subprocess.check_call)
File "/usr/src/structural-pipeline/build/MIRTK/build/lib/python/mirtk/subprocess.py", line 114, in _call
return execfunc(argv)
File "/usr/lib/python2.7/subprocess.py", line 541, in check_call
raise CalledProcessError(retcode, cmd)
CalledProcessError: Command '[u'/usr/src/structural-pipeline/build/MIRTK/build/lib/tools/merge-surfaces', '-smoothing-iterations', '100', '-source-array', 'RegionId', '-largest', 'True', '-smoothing-lambda', '1', '-output', '/imaging/opendata/HCP-BCP/analyses/infant_preprocessed/test_pipeline/workdir/sub-116056-ses-3mo/surfaces/sub-116056-ses-3mo/vtk/temp-recon/sub-116056-ses-3mo/cerebrum-1.vtp', '-snap-tolerance', '0.1', '-input', 'surfaces/sub-116056-ses-3mo/vtk/temp-recon/sub-116056-ses-3mo/cerebrum-rh.vtp', 'surfaces/sub-116056-ses-3mo/vtk/temp-recon/sub-116056-ses-3mo/cerebrum-lh.vtp', '-dividers', 'True', '-labels', 'surfaces/sub-116056-ses-3mo/vtk/temp-recon/sub-116056-ses-3mo/region-labels.nii.gz', '-tolerance', '1']' returned non-zero exit status 1
Error: recon-neonatal-cortex command returned non-zero exit status 1

The folder '/surfaces/sub-116056-ses-3mo/vtk/' is empty besides the 'temp-recon' subfolder. The 'temp-recon' folder includes the files cerebrum-lh.vtp, cerebrum-rh.vtp, and t2w-image.nii.gz. The command for surfaces appears to also be looking for 'region-labels.nii.gz' and 'cerebrum-1.vtp', which are not present in the folder. I am confused about why this is happening, as I have confirmed that participants who made it through the pipeline successfully have the same amount and type of input data as participants that hit this error.

Thanks in advance!

Best,
Lexi

sub-116056-ses-3mo.surface.log

Dear Alexamousley,

It seems that we are preprocessing the same dataset and have run into the same issues during surface reconstruction. Have you solved the problem yet?

Kind Regards

Hello,

I'm glad to hear I am not the only one, since it suggests it's not an issue with the pipeline setup on our end. Unfortunately, I have yet to find a solution. I am wondering if it could be an issue with the version of VTK, but that is a bit of a shot in the dark. I will post if I find a solution; please let me know as well if you manage to get it working!

Best,
Lexi

Dear Lexi
Good afternoon!

It is a relief to find someone in a similar situation.
If I find a way forward, I will contact you. For now, I am planning to try some other data to rule out a problem with the data itself and to see whether it fails at the same stage.
The BCP dataset I have is the 2019 release, and it is quite disorganised (as you will have noticed...).
Because of the subjects' ages, I also tried another pipeline for this dataset, DCAN-Labs/infant-abcd-bids-pipeline, which is available on Docker Hub and GitHub. That pipeline failed as well, even though the data were in BIDS format.

Good luck to us both in finishing the preprocessing successfully!

Kind Regards
Zhao Yu

Dear Lexi
Good afternoon!

How's everything going?
I am not sure whether you have made any progress with this pipeline, but I would like to share my current train of thought.

Firstly, I tried several other subjects (all from the BCP dataset, converted with dcm2nii) and received different error reports. Some of them are exactly the same as yours and some are not.
Here are some example .err logs (in .txt format):
MNBCP334326-session1.surface.err.txt
subject1-session1.surface.err.txt
As you can see, one of the error reports is almost the same as yours (apart from the Generic Warning), and the files that are present and missing are exactly the same.
There are also differences: for subject1-session1 there is no cerebrum-lh.vtp or cerebrum-rh.vtp in workdir\subject1-session1\surfaces\subject1-session1\vtk\temp-recon\subject1-session1.
I am still wondering why the same step with the same parameters can produce different error reports. Maybe, as you say, it is due to the VTK version, but it is all a black box to me.

Then I turned to NeuroStars and searched for dHCP-structural-pipeline. In a topic named "Problem with dHCP-structural-pipeline using only T2W image" (https://neurostars.org/t/problem-with-dhcp-structural-pipeline-using-only-t2w-image/23279), someone asked about a surface error report and got an answer (though their error report is not the same as ours). The replier said the error may be due to the kind of input data and that the '-recon-from-seg' parameter might help.

I am going to add this parameter when running the pipeline to see whether it also fixes our problem. Unfortunately, I use the Docker version of the pipeline, and in the Docker version the -recon-from-seg parameter is unavailable, so I will try the bash version later and check whether the error disappears when the parameter is added (just a try; I do not hold out much hope...).
It also seems that the person who answered that question is an expert in the dHCP field, so I plan to message them about this as well (though their last post was on Dec 21, '22, so that may not lead anywhere either).

I will write to you again once I have made some progress, though it may be a while due to other commitments.
How is everything going on your side? I hope the pipeline runs smoothly for you. Spring has come; I wish you all the best.

Kind Regards
Zhao

Hi both! Thanks for reaching out. I have not been the most active in the project over the past few years, but maybe I can shed some light on what is happening here.

The folder '/surfaces/sub-116056-ses-3mo/vtk/' is empty besides the 'temp-recon' subfolder. The 'temp-recon' folder includes the files cerebrum-lh.vtp, cerebrum-rh.vtp, and t2w-image.nii.gz. The command for surfaces appears to also be looking for 'region-labels.nii.gz' and 'cerebrum-1.vtp', which are not present in the folder.

This is probably expected and is no indication of the cause of the errors you see, nor that there is anything wrong with the input data you are providing. Most temporary files are deleted by the Python process implementing the surface reconstruction upon termination, whether it finished successfully or exited with an error.

  • cerebrum-lh.vtp and cerebrum-rh.vtp are the initial surface meshes that were fit to the segmentations of the left and right cortical structures separately. The input to this process is region-labels.nii.gz, which is just a remapping and post-processing of the Draw-EM brain segmentation.
  • cerebrum-{i}.vtp are different stages of combining the two hemispheres into a single closed surface mesh. Subsequent steps of WM/cGM and cGM/CSF (pial) surface reconstruction are based on this combined mesh. This ensures that the final surface meshes for the left and right hemispheres are disjoint, i.e., non-overlapping.

The "Error: Could not find a closed intersection with finite cutting plane near segmentation boundary!" relates to a step as part of the process that tries to combine the initial WM/cGM surfaces from the two hemispheres into a single surface. As part of this, it also includes the brain stem and cerebellum segmentations and tries to find a suitable planar cut for dividing the interior of that combined surface mesh into a) left hemisphere, b) right hemisphere, and c) brainstem and cerebellum. When your input data differs from the general shape found in neonates, the (generally hard-coded) settings for this step may indeed fail.

The Python script that is run as part of the structural pipeline is recon-neonatal-cortex as found in MIRTK (though possibly with some differences in the code, depending on the pipeline version). This command has a --debug option which you can use to keep intermediate files, which may be helpful in debugging the issue. It would also help to isolate the execution of this command by storing its inputs (the outputs of the preceding pipeline steps) and then running only this command for one particular subject for which you observed the aforementioned error; a rough sketch follows below.
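For example, here is a minimal sketch of snapshotting the surface reconstruction files of one failing subject before re-running with --debug. The directory layout is taken from the error message in the first post and is an assumption about your setup, not a prescribed structure:

    import shutil
    from pathlib import Path

    # Assumed layout, based on the paths in the error message above; adjust to
    # your own work directory and subject/session ID.
    workdir = Path("/imaging/opendata/HCP-BCP/analyses/infant_preprocessed/"
                   "test_pipeline/workdir")
    subject = "sub-116056-ses-3mo"

    # Copy everything the surface step has produced for this subject so the
    # failing command can be re-run and inspected without disturbing the
    # original pipeline outputs. Re-running recon-neonatal-cortex with --debug
    # then keeps the intermediate files instead of deleting them on exit.
    src = workdir / subject / "surfaces" / subject
    dst = Path("/tmp/recon-debug") / subject
    dst.parent.mkdir(parents=True, exist_ok=True)
    shutil.copytree(src, dst, dirs_exist_ok=True)  # requires Python >= 3.8
    print("Snapshot written to", dst)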

The Python function join_cortical_surfaces should be the one raising this error, specifically via the merge-surfaces command of MIRTK that it invokes.
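To make that concrete, here is a minimal sketch of re-running just that merge-surfaces call in isolation, with the argument list taken from the CalledProcessError in the first post. The working directory is inferred from the relative -input paths and is an assumption about that particular setup; note also that inputs such as region-labels.nii.gz will only still exist if the previous run kept its intermediate files (e.g., via --debug):

    import subprocess

    # Working directory of the original call, inferred from the relative
    # -input/-labels paths in the error message (an assumption; adjust as needed).
    subject_workdir = ("/imaging/opendata/HCP-BCP/analyses/infant_preprocessed/"
                       "test_pipeline/workdir/sub-116056-ses-3mo")
    temp = "surfaces/sub-116056-ses-3mo/vtk/temp-recon/sub-116056-ses-3mo"

    # Argument list copied from the CalledProcessError above.
    argv = [
        "/usr/src/structural-pipeline/build/MIRTK/build/lib/tools/merge-surfaces",
        "-smoothing-iterations", "100",
        "-source-array", "RegionId",
        "-largest", "True",
        "-smoothing-lambda", "1",
        "-output", subject_workdir + "/" + temp + "/cerebrum-1.vtp",
        "-snap-tolerance", "0.1",
        "-input", temp + "/cerebrum-rh.vtp", temp + "/cerebrum-lh.vtp",
        "-dividers", "True",
        "-labels", temp + "/region-labels.nii.gz",
        "-tolerance", "1",
    ]
    subprocess.check_call(argv, cwd=subject_workdir)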

You could probably also bypass the error using the --nocut option, though I am not too clear on what the implications of using this option would be for downstream tasks. It might be that the surface inflation and spherical mapping would be hampered and that it would generally break the rest of the structural image processing pipeline.


My apologies for the late reply! We were setting up and running the other dHCP pipeline linked in the Neurostars post. Sadly, despite using -recon-from-seg, we received the same error from recon-neonatal-cortex. Have you made any progress? Hope you are well!

Best,
Lexi


Dear Lexi

Good afternoon! I am really sorry for taking so long to reply. Last month I had an accident: I fractured my left arm and then had an operation. I spent the whole of last month in hospital and have only just recovered and started working again. Please accept my apology.

Unfortunately, I have made no further progress on the problem; I still do not know how to solve it or get past the error (sorry again...). Before my accident I was busy reading articles for a review and finishing my coursework.

But some new ideas have come up. As schuhschuh mentioned on Apr 5, the failure may occur because the input data differs from the general shape of neonates. BCP covers ages from 0 to 5 years, whereas the dHCP pipeline is designed for neonates aged 0 to 40 weeks. In my trials, the BCP subjects within that age range did run through the dHCP pipeline successfully. Maybe you can check whether your successfully preprocessed infants all fall within that age range; a rough sketch of such a check follows below.
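If it helps, here is a minimal sketch of that check, assuming a hypothetical runs.csv bookkeeping file; the file name and column names are made up for illustration only:

    import csv

    # Hypothetical bookkeeping file: one row per subject/session with the age
    # at scan and whether the surface step completed. Both the file name and
    # the column names are assumptions for this sketch.
    succeeded, failed = [], []
    with open("runs.csv") as f:
        for row in csv.DictReader(f):
            age_weeks = float(row["age_at_scan_weeks"])
            (succeeded if row["surface_ok"] == "yes" else failed).append(age_weeks)

    # The dHCP structural pipeline targets neonates (roughly 0-40 weeks, as
    # noted above); if the failures cluster at older ages, that supports the
    # age/shape explanation rather than a data-format problem.
    for label, ages in (("succeeded", succeeded), ("failed", failed)):
        if ages:
            print("%s: n=%d, ages %.1f-%.1f weeks"
                  % (label, len(ages), min(ages), max(ages)))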
I have not yet tried whether the --nocut option mentioned by schuhschuh solves the problem. I have been running the dHCP pipeline in Docker, and adding that option would probably require editing the Python scripts, which I have not tried yet. Maybe you can try the option and see what happens.

What schuhschuh said has helped me a lot, and I will keep browsing the pages mentioned to work out where to go next. I hope it is helpful to you too.
Sorry again for my late reply. Have you made any progress? I wish you all the best (and, speaking from experience, the most important thing is to take care of your health).

Kind Regards
Zhao


Dear schuhschuh

Sorry for the late reply! Last month I fractured my left arm and spent the whole time in hospital. I have only just returned to work and been able to reply. Please accept my apology!

And thanks a lot! What you said has helped me a great deal. I did not know about MIRTK before and have only run the pipeline in Docker, and your explanation of the .vtp files also helps me understand what is happening.
I will go on to browse the pages you mentioned.

Thanks again for your help!
I wish you all the best.

Kind Regards
Zhao