Getting layers in EPI space
giobattistella opened this issue · 6 comments
Hi,
I followed the steps described in https://layerfmri.com/2017/11/26/getting-layers-in-epi-space/ to extract layers from the MP2RAGE in EPI space. Everything seemed to work fine until the transformation of the pial surface from FreeSurfer to volume with SUMA. It looks like the pial rim is "scattered" in some portions of the brain, which impedes the layering of the cortex. The WM rim looks fine, though.
I am happy to share the data with you if that might help you better understand the problem I am having.
Best,
Giovanni
Hi Giovanni,
Yes, I think I know what you mean. You mean an artifact like this?
(artifact from @kenshukoiso)
This is a feature of working in vertex space. It suggests that the vertex density is not high enough to capture all voxels.
If you increase the vertex density, this effect will be reduced.
E.g., for the MapIcosahedron command, you can use the option `-ld 2000`. Note that this will increase the file size of the surfaces.
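For reference, here is a minimal sketch of such a call; the spec file name and output prefix are placeholders for your own data:

```bash
# Resample the FreeSurfer/SUMA surfaces onto a denser standard icosahedral mesh.
# subj_lh.spec and the std.2000. prefix are hypothetical placeholders.
MapIcosahedron -spec subj_lh.spec -ld 2000 -prefix std.2000.
```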
Note that the blog post that you are referring to is from 2017. LayNii has improved quite a bit since then.
I would recommend using the new program LN2_LAYERS instead in step 5.
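A minimal sketch of a call, with a placeholder rim file name (see the LN2_LAYERS help for the expected rim label values and all options):

```bash
# Compute three equi-volume layers from a segmented rim image
# (rim.nii is a hypothetical file name).
LN2_LAYERS -rim rim.nii -nr_layers 3 -equivol
```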
Best regards,
Renzo
@giobattistella, what is your progress on this issue? It would be useful if you could let us know whether your issue is solved. Thanks in advance.
Hi everybody,
I am really sorry for the delay in getting back to you, but I had to work urgently on something else in the past weeks. As Renzo suggested, increasing the vertex density improves the results. I tried `-ld 1200`, and the results are better than my original attempt with `-ld 564`, but still not good enough. The problem is that when I try `-ld 2000`, the command crashes; I get the message "Killed" and no outputs are stored. Could this be due to the size of the output files that need to be written? Do you have any ideas on how to solve this problem? Thanks a lot for your help.
This sounds like a FreeSurfer/SUMA problem and is therefore out of scope for us. If @layerfMRI does not have any suggestions, I will close the topic.
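One note that may still help: a bare "Killed" message usually means the Linux out-of-memory (OOM) killer terminated the process, i.e. the machine ran out of RAM rather than disk space. Assuming a Linux system with readable kernel logs, you can check with something like:

```bash
# Look for OOM-killer entries in the kernel log after the crash.
dmesg | grep -i -E "out of memory|killed process"
```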
@kenshukoiso has looked into the number of vertices a bit for a representative whole-brain dataset with a voxel size of 0.4 mm. This shows that 2000k (2,000,000) vertices per hemisphere are necessary in order not to miss any voxels. (NOTE: the number refers to the argument `-ld` in SUMA. As far as I understand, it refers to the number of vertices per hemisphere in units of 1000; the default is `-ld 64`.) This means that the mesh file is 47 GB!
And it takes a lot of time to work with it...
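If you want to verify the node count of a mesh you have generated, SUMA's SurfInfo utility reports it; the file name below is a placeholder:

```bash
# Print basic properties of a surface, including its number of nodes.
SurfInfo std.2000.lh.pial.gii
```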
Closing due to no response. Feel free to reopen in the future.