The Hantman lab investigates skilled motor control through a head-fixed reach-to-grab task in mice. Over the last few years, a large ground truth dataset has been developed for classifying prominent features of the behavior (lift, handopen, grab, supinate, atmouth, chew).
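Concretely, a ground truth ethogram for one trial can be thought of as a binary matrix of frames by behaviors. A minimal sketch of this idea (the frame count and labeled spans below are made up for illustration, not real data):

```python
import numpy as np
import pandas as pd

# the six behaviors labeled in the Hantman lab ground truth dataset
BEHAVIORS = ["lift", "handopen", "grab", "supinate", "atmouth", "chew"]

# hypothetical ethogram: one row per video frame, one column per behavior,
# 1 = the behavior occurs in that frame, 0 = it does not
n_frames = 8
ethogram = pd.DataFrame(
    np.zeros((n_frames, len(BEHAVIORS)), dtype=int), columns=BEHAVIORS
)
ethogram.loc[2:4, "lift"] = 1      # lift spans frames 2-4 (inclusive)
ethogram.loc[4:5, "handopen"] = 1  # handopen overlaps the end of the lift

print(ethogram.sum())  # total labeled frames per behavior
```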
The goal of this project is to create a mega-classifier for the Hantman lab to use for future automated behavioral classification.
For more detailed instructions, please see the documentation: https://animal-soup.readthedocs.io/
You will need to have Docker Desktop installed.
# clone the repo
git clone https://github.com/hantman-lab/animal-soup.git
cd animal-soup
# build the docker image
docker build -t ansoup .
# run the docker image
docker run --gpus all -w /animal-soup -it --rm -p 8888:8888 -v /home:/home ansoup
# launch jupyter lab from the running container, available at `localhost:8888`
jupyter lab --allow-root --ip=0.0.0.0
The `-v /home:/home` flag assumes that the filesystem you want to mount (where your behavioral data is located) is under a directory called `/home`. If your data is located somewhere else, you will need to change the mount path when you run the container. Mount paths take the form `-v /your/local/file/directories:/container/file/structure`.
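For example, if your behavioral data lives under `/mnt/data` (a hypothetical path used here for illustration), the run command would become:

```shell
# hypothetical data location -- substitute the directory that actually
# contains your behavioral data
docker run --gpus all -w /animal-soup -it --rm -p 8888:8888 \
    -v /mnt/data:/mnt/data ansoup
```

Mounting to the same path inside the container keeps file paths in your notebooks identical whether they are resolved inside or outside Docker.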
Note: You only need to build the Docker image once. After the initial build, simply execute the run command to start the container.
Important: A running Docker container does not save changes across runs. When you stop the container, any changes made to files in the container's filesystem will not persist the next time you run it. However, this DOES NOT apply to mounted files. By default, your behavior predictions will be saved under the `parent_data_path` that you set before running inference. Since the `parent_data_path` is located under the mounted volume, this will not be an issue. Just make sure that any Jupyter notebooks you want saved are also located under the mounted volume and NOT in the container's filesystem!
- Uses `pandas.DataFrame` to organize Hantman lab behavioral data.
- Modeled after DeepEthogram.
- `animal-soup` provides pre-trained models, initially trained on 1000 ground truth videos with hand-labeled ethograms. These models were trained specifically for the Hantman lab reach-to-grab task. Please see the demo notebook for downloading the pre-trained models from Zenodo.
- Uses `pydatagrid` and `fastplotlib` to visualize behavior alongside the corresponding ethograms.
- Uses `fastplotlib` to clean ground truth ethograms before model training.
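The `DataFrame`-based organization mentioned above can be sketched roughly as follows. Note that the column names and values here are hypothetical, chosen only to illustrate the idea of one row per trial video, and are not the actual `animal-soup` schema:

```python
import pandas as pd

# hypothetical layout: one row per trial video, with metadata columns
# and a column holding each trial's ethogram (None until labeled/predicted)
df = pd.DataFrame(
    {
        "animal_id": ["mouse01", "mouse01", "mouse02"],
        "session_id": ["sess01", "sess02", "sess01"],
        "trial_id": ["trial01", "trial01", "trial01"],
        "ethograms": [None, None, None],
    }
)

# select all trials for one animal
mouse01_trials = df[df["animal_id"] == "mouse01"]
print(len(mouse01_trials))  # -> 2
```

Organizing trials this way makes it easy to filter by animal or session and to attach per-trial results (such as predicted ethograms) as new columns.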