NOTE: I am not maintaining this repository anymore, since I got neuro_ansible working with Packer. The end result is the same, but Ansible is easier to maintain and tweak to your taste.
This repository contains a Dockerfile for an Ubuntu image geared towards neuroimaging.
It sets up the NeuroDebian repository and installs:
- fsl-complete
- AFNI
- VTK
- ITK and SimpleITK (currently commented out in the Dockerfile)
- DCM2NIIX
- PETPVC
- ANTs
- SPM12 with MCR
- Python and the NiPy tools
- Neurita/boyle and pypes
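For a rough idea of the approach, a Dockerfile along these lines starts from a base image with the NeuroDebian repository enabled and installs packages with apt. The sketch below is illustrative only and is not the Dockerfile shipped in this repository; the base image tag and the package selection are assumptions.

# Minimal sketch, not the actual Dockerfile of this repository.
# The official neurodebian base image ships with the NeuroDebian apt
# repository pre-configured; the tag and package names are assumptions.
FROM neurodebian:xenial

# fsl-complete lives in NeuroDebian's non-free area; enable that component
# first if the base image has not already done so.
RUN apt-get update && \
    apt-get install -y fsl-complete dcm2niix && \
    rm -rf /var/lib/apt/lists/*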
To build the image on your machine:

- Install Docker (one common way to do this on Ubuntu is sketched right after this list).
- Clone this repository and cd into it:
  git clone https://github.com/Neurita/neuro_docker.git
  cd neuro_docker
- Build the Docker image:
  docker build -t="dockerfile/neuro" .
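If Docker is not installed on your host yet (the first step in the list above), two common ways to get it on an Ubuntu host are the distribution package or Docker's official convenience script. These commands concern your host machine, not this repository, so adapt them to your setup:

# Option 1: the Ubuntu-packaged version
sudo apt-get update && sudo apt-get install -y docker.io

# Option 2: Docker's convenience script for the latest release
curl -fsSL https://get.docker.com -o get-docker.sh
sudo sh get-docker.sh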
Once the image is built, you can start a container and run your analyses inside it:
docker run -it dockerfile/neuro
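Besides an interactive session, docker run can execute a single command inside the container and then exit. The sanity check below assumes only that the image does not define a custom ENTRYPOINT that would reinterpret the arguments; it prints the Ubuntu release the container is based on:

docker run dockerfile/neuro cat /etc/os-release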
If you want to share a host folder containing data with the container, run:
docker run -it -v <host_path>:<guest_path> dockerfile/neuro
For example, if you have data in /media/data/brains on the host and want it to be accessible inside the container at /data, you should run:
docker run -it -v /media/data/brains:/data dockerfile/neuro
The Dockerfile sets up a Conda Python environment with the Python dependencies for Pypes.
Once inside the container, activate this Conda environment by running:
source activate
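To check that the environment is active and its packages are importable, something like the following should work; nipype is used here purely as an illustrative example of one of the installed NiPy tools, so substitute whichever package you need:

python -c "import nipype; print(nipype.__version__)"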
The Dockerfile clears the apt package index after installing the needed dependencies, to keep the image small.
If you want to install more packages inside the container, you first have to rebuild this index by running:
apt-get update
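For example, to add a text editor inside the running container in one go (nano is only an illustrative package, not something the image requires):

apt-get update && apt-get install -y nano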
Remember to add the --rm flag to the docker run command if you don't want Docker to keep the stopped container around after you exit it; this will save you disk space.
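For example, combining --rm with the data-sharing example from above:

docker run --rm -it -v /media/data/brains:/data dockerfile/neuro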
To get a better overview of the docker run command and its options, run:
docker run --help