```
SCRIPT/
├── webserver/             ---- Django REST Framework web server
│   ├── Dockerfile
│   ├── manage.py
│   ├── app/               ---- settings
│   └── script/            ---- script web app
├── frontend/              ---- React
│   ├── Dockerfile
│   └── src/               ---- source code
├── s3watch/               ---- watch the algorithm results, trigger endpoint to update the DB
├── ec2setup/              ---- code running on EC2
├── utils/                 ---- utils which can be copied by all images during image build
│   ├── aws/
│   │   ├── terraform/     ---- Terraform configuration
│   │   └── upload/        ---- shell/Python scripts to split raw data and upload
│   └── mosek/             ---- MOSEK license
├── docker-compose.yml     ---- Docker Compose config
├── variable.env           ---- configuration for environment variables
└── run.py                 ---- one-key script to start the whole project
```
I used Anaconda to manage my environment, but you can easily replicate this with virtualenv. Docker instructions are below.
```shell
# install Postgres first to avoid headaches
brew install postgresql
# to start the DB you can just:
brew services start postgresql

# create the needed environment
conda create -n venv_script python=3.7
conda activate venv_script

# install the backend dependencies
cd webserver
pip install -r requirements.txt
# other parts of the project require additional dependencies;
# instead of discovering each one when your code breaks,
# just install them now:
pip install celery pandas cvxpy scikit-learn

# install the frontend dependencies
cd ../frontend
yarn install
```
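To sanity-check that the backend dependencies installed correctly, here is a quick stdlib-only probe (the package list mirrors the `pip install` line above; `requirements.txt` may pull in more):

```python
import importlib.util

def missing_packages(names):
    """Return the subset of `names` that cannot be imported."""
    return [n for n in names if importlib.util.find_spec(n) is None]

# modules expected after the installs above (these are import names,
# e.g. scikit-learn is imported as `sklearn`)
expected = ["celery", "pandas", "cvxpy", "sklearn", "django"]
print(missing_packages(expected))  # an empty list means everything is importable
```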
Make sure you have Postgres installed locally, then create a database named `scriptdb`. I used TablePlus to create a DB with that name on my localhost, but you can easily achieve the same thing via the command line. The connection params for development are the Postgres defaults; you can also check the settings file to find them: `webserver/app/settings/base.py`.
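For orientation, the "Postgres defaults" amount to a Django `DATABASES` block along these lines. The authoritative values live in `webserver/app/settings/base.py`; the user/password/host values below are assumptions about a stock local install, not copied from the repo:

```python
# Sketch of the development DB settings, assuming a stock local Postgres.
# Check webserver/app/settings/base.py for the real values.
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": "scriptdb",    # the database created above
        "USER": "postgres",    # assumption: default superuser
        "PASSWORD": "",        # assumption: no local password
        "HOST": "localhost",
        "PORT": "5432",        # Postgres default port
    }
}
print(DATABASES["default"]["NAME"])
```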
```shell
# migrate the DB
cd webserver
python manage.py migrate --settings=app.settings.base
```
Running the project:

```shell
cd webserver
# the project assumes localhost:8000, which should be the default
python manage.py runserver --settings=app.settings.base
```

In a second terminal (runserver blocks the first one):

```shell
cd frontend
yarn start
```

Your browser should launch automatically and point to localhost:3000.
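Once both servers are up, a small stdlib probe can confirm they respond. The ports come from the defaults above; the helper itself is mine, not part of the repo:

```python
import urllib.request
import urllib.error

def is_up(url, timeout=2):
    """Return True if an HTTP GET to `url` gets any response at all."""
    try:
        urllib.request.urlopen(url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        return True   # server responded, even if with an error status
    except (urllib.error.URLError, OSError):
        return False  # connection refused / DNS failure / timeout

# once both dev servers are running, you would expect:
# is_up("http://localhost:8000")  -> True  (Django backend)
# is_up("http://localhost:3000")  -> True  (React frontend)
```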
- Install `docker` and `docker-compose`.
- Install the `aws` CLI and `terraform`.
- Configure `./variables.env`, and please make sure the resource names will not conflict with existing resources.
- Generate an SSH key pair (pem) with the key name `script`, download the key, and copy it to `./utils/aws/terraform/`.
- Enter `./utils/aws/terraform/` and run `chmod 400 script.pem`.
- Install the Python dependencies:

  ```shell
  pip install paramiko
  pip install pytz
  ```
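The `chmod 400` matters because SSH refuses private keys with loose permissions. If you script the setup in Python instead, the equivalent is a one-line `os.chmod` (the file path below is a throwaway stand-in for `script.pem`):

```python
import os
import stat
import tempfile

def lock_down_key(path):
    """Make a private key readable by the owner only (chmod 400)."""
    os.chmod(path, stat.S_IRUSR)  # S_IRUSR == 0o400

# demo on a temporary file standing in for script.pem
with tempfile.NamedTemporaryFile(delete=False, suffix=".pem") as f:
    key_path = f.name
lock_down_key(key_path)
print(oct(stat.S_IMODE(os.stat(key_path).st_mode)))  # 0o400
```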
- Run with existing AWS resources which are properly configured: configure `./variables.env` with the existing resources (EC2 instance and S3 bucket), then:

  ```shell
  python run.py -i <ec2_ip> -d <db_host>
  ```

- Launch new resources and run:

  ```shell
  python run.py
  ```

- For more help:

  ```shell
  python run.py --help
  ```
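Based on the usage above, run.py's command line presumably looks something like the argparse sketch below. Only the `-i`/`-d` flags come from the README; the long names, help strings, and defaults are my assumptions:

```python
import argparse

def build_parser():
    """Sketch of run.py's CLI, inferred from the README usage examples."""
    p = argparse.ArgumentParser(
        description="One-key script to start the whole project"
    )
    p.add_argument("-i", "--ec2-ip", dest="ec2_ip", default=None,
                   help="IP of an existing, properly configured EC2 instance")
    p.add_argument("-d", "--db-host", dest="db_host", default=None,
                   help="host of an existing database")
    # with no flags, run.py launches new AWS resources instead
    return p

args = build_parser().parse_args(["-i", "1.2.3.4", "-d", "db.example.com"])
print(args.ec2_ip, args.db_host)  # 1.2.3.4 db.example.com
```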