LN2_MULTILATERATE perimeter_chunk is an empty output file
AtenaAkbari opened this issue · 11 comments
Hi LayNii team,
I'm getting an empty perimeter_chunk output when running LN2_MULTILATERATE, and thus cannot proceed with the flattening. Do you have any idea what I'm missing here?
thank you :)
Atena
Hi @AtenaAkbari ,
I am guessing that you are inputting an improper -rim file or -control_points file. From the LN2_MULTILATERATE help menu:
-rim : Segmentation input. Use 3 to code gray matter voxels.
This program only injects coordinates to the voxels
labeled with 3.
-control_points : A middle gray matter nifti file generated by LN2_LAYERS
which is additionally modified (e.g. using ITK-SNAP or FSLEYES)
to follow one of the two cases:
CASE I: One midgm voxel labeled with value '2' indicates
the centroid/origin of the region of interest...
Also, see Figure 2 at https://thingsonthings.org/ln2_multilaterate/
Can you check your inputs and let us know if the problem persists?
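As a quick way to verify inputs like these, here is a minimal sketch (not part of LayNii itself) that checks the two conditions above: the rim must contain gray matter voxels labeled 3, and for CASE I the control-points file must contain exactly one voxel labeled 2. It assumes the volumes are already loaded as numpy arrays (e.g. with nibabel's `nib.load(...).get_fdata()`); the arrays below are synthetic stand-ins.

```python
import numpy as np

# Synthetic stand-ins for the real NIfTI volumes.
rim = np.zeros((4, 4, 4), dtype=int)
rim[1:3, 1:3, 1:3] = 3          # gray matter voxels coded with 3
rim[0, 1:3, 1:3] = 1            # outer boundary (CSF side)
rim[3, 1:3, 1:3] = 2            # inner boundary (WM side)

control = np.zeros((4, 4, 4), dtype=int)
control[2, 2, 2] = 2            # CASE I: exactly one midgm voxel labeled 2

# Coordinates are only injected into voxels labeled 3, so the rim
# must contain that label somewhere.
assert np.any(rim == 3), "rim has no gray matter voxels (value 3)"

# CASE I requires exactly one control point with value 2.
n_points = int(np.sum(control == 2))
print("control points labeled 2:", n_points)
```

If `n_points` is 0 (e.g. the labeled voxel was saved into a separate mask instead of the midgm file), the program has no origin to multilaterate from.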
Hi @ofgulban
Thanks for your reply. The rim file seems fine: value 1 for the CSF/GM boundary, 2 for the WM/GM boundary, and the GM area filled with 3. I labeled one voxel of the midgm with the value 2; it shouldn't be saved as a mask, right? I saved the midgm as it is, but with one voxel labeled 2. I get the perimeter file, but the perimeter_chunk is an empty nifti file :(
I need the exact command you are using and the input files (if possible). Also, which LayNii version are you using?
@AtenaAkbari , any update on this?
Hi @ofgulban
Sorry for the delay. I get the output files with case II of LN2_MULTILATERATE, but nothing with case I. I'm not sure if I'm choosing the right voxels; would you mind taking a look at the attached files? Thank you :)
forFaruk.zip
I see, thanks for sending the files. The issue is that you are trying to apply LN2_MULTILATERATE to a rim file with a single segmented slice. LN2_MULTILATERATE is built for 3D segmentations. You can check the Ding2016_* files within our test data folder to understand the expected input.
However, if you do want to analyze only a single slice of your image, I recommend referring to the following blog post: https://layerfmri.com/2018/09/26/columns/ (see the last image that says IMAGIRO view).
One quick hack would be to just 'stack' the 2D slices on top of each other, so that LN2_MULTILATERATE solves the 2D problem along one axis in 3D space. For the example data from @AtenaAkbari, this would look like the attached data/script. (The script depends on FSL.)
fromAtena_2Dstack.zip
Maybe this helps already?
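The core of the stacking idea can be sketched in a few lines. This is only an illustration with a synthetic slice, not the attached FSL script; in practice you would load the real slice with nibabel and write the stacked volume back to NIfTI.

```python
import numpy as np

# A synthetic stand-in for one segmented 2D slice.
slice_2d = np.zeros((5, 5), dtype=int)
slice_2d[2, 1:4] = 3             # a strip of gray matter labeled 3

# Replicate the slice along a new third axis so the 2D segmentation
# becomes a thin pseudo-3D volume that LN2_MULTILATERATE can process.
n_copies = 8                     # arbitrary number of stacked copies
vol_3d = np.stack([slice_2d] * n_copies, axis=2)

print(vol_3d.shape)              # → (5, 5, 8)
```

Every stacked slice is identical, which is exactly why the resulting lateral coordinates are redundant, as discussed below.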
@layerfMRI I don't think what you are suggesting is helpful aside from playing with the LN2_MULTILATERATE program. There can be many future repercussions of this "hack" depending on what @AtenaAkbari is planning to do in the following steps. I advise caution if this "hack" is going to be used seriously.
In a 2D "slice", you only have one "lateral" cortical surface axis to determine (as @layerfMRI does in the blog post referenced above). By using the "hack", yes, we can generate 2D lateral cortical surface coordinates, but they will be redundant for representing the 1D surface axis along the 2D slice. Also note that the LN2_MULTILATERATE algorithm could be tweaked to handle the 2D case; however, this would require some work to implement properly.
Thanks a lot @layerfMRI and @ofgulban for your help. I drew the ROI (the rim file) in ITK-SNAP this time; however, not all slices contain the ROI. For the landmark, I labeled the midgm in only one of these slices. Would you mind taking a look at the attached files to see whether the landmark and rim files are "LN2_MULTILATERATE-friendly"? Thanks heaps :)
3D.zip
@AtenaAkbari , I see that you closed this issue, but what was your progress? It would be useful if you could tell us whether you solved your issue, decided to use something else, or realized that LN2_MULTILATERATE was not doing what you thought it was doing. Thanks in advance.
Hi @ofgulban
Sorry for the late response. Yes, the problem is solved. Thanks for your follow-up.