A scalable distributed web application and API for asynchronous execution of NPAmp jobs on a cluster of workers
This project uses Docker to deploy the following services:
- Django and Django REST Framework-based REST API
- Celery-based job executors
- RabbitMQ as a message broker
- PostgreSQL for user and job storage
- MongoDB GridFS for result storage
- Redis for caching and session storage
- Nginx for proxying and serving static files
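The services cooperate in the usual Celery pattern: the Django API publishes jobs to RabbitMQ, and the executors consume them and write their output to GridFS. The sketch below only illustrates that flow; the module path, task name, broker URL, hostnames, and database name are assumptions rather than values taken from this repository.

```python
# Illustrative worker sketch; not the project's actual task code.
import gridfs
from celery import Celery
from pymongo import MongoClient

# Assumed broker URL: RabbitMQ reachable under the docker-compose service name "rabbitmq".
app = Celery("hippo", broker="amqp://guest:guest@rabbitmq:5672//")

@app.task
def run_npamp_job(job_id, parameters):
    """Run one NPAmp job and store its output in MongoDB GridFS."""
    output = b"..."  # the real executor would invoke NPAmp with `parameters` here

    # Assumed MongoDB hostname ("mongo") and database name ("hippo").
    fs = gridfs.GridFS(MongoClient("mongodb://mongo:27017/")["hippo"])
    file_id = fs.put(output, filename="job-{}".format(job_id))
    return str(file_id)
```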
For development environments, before deploying, run `bin/mkdevenv.sh`.
For test environments, before deploying, run `bin/mktestenv.sh`.
For production environments, before deploying, set `ALLOWED_HOSTS` in `docker-compose.django.yml` to the correct comma-separated list of values.
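A common way to honor that setting is to pass it into the container as an environment variable and parse it in the Django settings module. The snippet below is a sketch of that pattern only; the variable name and default are assumptions, not necessarily what `hippo/hippo/settings.py` does.

```python
import os

# Assumed convention: ALLOWED_HOSTS arrives as a comma-separated environment variable.
ALLOWED_HOSTS = [
    host.strip()
    for host in os.environ.get("ALLOWED_HOSTS", "localhost").split(",")
    if host.strip()
]
```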
To deploy, first install `docker-engine` and `docker-compose`. Optionally, to skip building and deploy from the last tested image (which, however, may be out of sync with the scripts and configuration), run `sudo docker pull vsemionov/hippo`. Finally, run `bin/deploy.sh`.
The configuration is contained in:
- `requirements.txt` - Python dependencies and versions
- `Dockerfile` - container image build procedure
- `docker-compose.*` - service configuration and dependencies
- `hippo/hippo/settings.py` - main Django-based service configuration
- `hippo/nginx.conf` - Nginx configuration template
- `hippo/*.sh` - service startup scripts
- `hippo/hippo/*.conf` - additional per-service Django configuration
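One way such per-service files are typically layered on top of the main settings is for `settings.py` to read an extra configuration file selected by an environment variable. The following is only a sketch of that idea; the file format, variable name, and keys are assumptions, not taken from this repository.

```python
import configparser
import os

# Assumed convention: each service points HIPPO_CONF at its own .conf file.
config = configparser.ConfigParser()
config.read(os.environ.get("HIPPO_CONF", "hippo/hippo/web.conf"))

# Example of overriding a setting from the per-service file (assumed section/key names).
DEBUG = config.getboolean("django", "debug", fallback=False)
```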