# nshm-toshi-api

An extensible API that retains task metadata, plus the important input and output files from data-intensive science processes. Custom task schemas can be defined to suit each task type's metadata needs.

Where NSHM experiments and outputs are captured (not so old fashioned, tosh).

## Getting started

```
virtualenv nshm-toshi-api
npm install --save serverless
npm install --save serverless-dynamodb-local
npm install --save serverless-s3-local
npm install --save serverless-python-requirements
npm install --save serverless-wsgi
sls dynamodb install

poetry install
```

Note on running `sls` (an alias for `serverless`): depending on your local npm configuration, it may be necessary to run `npx serverless` instead of `sls` in the following examples.

## Smoketest

```
poetry shell
sls dynamodb start --stage local &\
sls s3 start &\
SLS_OFFLINE=1 TOSHI_FIX_RANDOM_SEED=1 FIRST_DYNAMO_ID=0 sls wsgi serve
```
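If all is well, the API is now being served (the Toshi UI section below points at http://localhost:5000/graphql). A minimal sketch for checking the endpoint from Python, assuming the `requests` package is installed:

```python
# Minimal liveness check for the local GraphQL endpoint (a sketch, not
# part of the repo). Assumes the WSGI server is on port 5000, matching
# the endpoint used in the Toshi UI section below.
import requests

resp = requests.post(
    "http://localhost:5000/graphql",
    json={"query": "{ __typename }"},  # trivial query any GraphQL schema answers
)
resp.raise_for_status()
print(resp.json())  # expect a "data" payload naming the root query type
```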

Then in another shell,

```
docker run -p 9200:9200 -p 9300:9300 -e "discovery.type=single-node" docker.elastic.co/elasticsearch/elasticsearch:6.8.0
```

(To run locally only, stop here.)
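Before continuing, it can be worth confirming the Elasticsearch container is answering. A quick sketch, again assuming `requests` is installed:

```python
# Check that the local Elasticsearch container is up (a sketch).
# Elasticsearch reports basic cluster info on port 9200.
import requests

info = requests.get("http://localhost:9200").json()
print(info["version"]["number"])  # should print 6.8.0 for this image
```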

Then in another shell,

```
poetry run python3 graphql_api/tests/smoketests.py
```

## Unit test

```
SLS_OFFLINE=1 TESTING=1 poetry run pytest
```
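For orientation, a hedged sketch of what a test in this style can look like; the import path `graphql_api.schema` and the name `root_schema` are assumptions about the project layout, not the repo's actual API:

```python
# A hedged sketch of a unit test in this style (not taken from the repo).
import os

# Set the offline/test flags before importing anything that talks to AWS,
# mirroring the env vars on the pytest command line above.
os.environ["SLS_OFFLINE"] = "1"
os.environ["TESTING"] = "1"

from graphql_api.schema import root_schema  # hypothetical import path


def test_schema_answers_trivial_query():
    result = root_schema.execute("{ __typename }")  # graphene-style execute
    assert not result.errors
```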

## Test locally with Toshi UI

```
docker run -p 9200:9200 -p 9300:9300 -e "discovery.type=single-node" docker.elastic.co/elasticsearch/elasticsearch:6.8.0
sls dynamodb start --stage local &\
sls s3 start &\
SLS_OFFLINE=1 sls wsgi serve
```

Then in another shell,

```
SLS_OFFLINE=1 S3_BUCKET_NAME=nzshm22-toshi-api-local S3_TEST_DATA_PATH=s3_extract python3 graphql_api/tests/upload_test_s3_extract.py
```
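To confirm the extract landed in the local bucket, a boto3 listing along these lines can help. A sketch: port 4569 and the `S3RVER` credentials are serverless-s3-local defaults, and may differ in your config:

```python
# List what upload_test_s3_extract.py wrote to the local S3 emulation
# (a sketch). Port 4569 and the S3RVER credentials are the
# serverless-s3-local defaults; adjust if your config differs.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="http://localhost:4569",
    aws_access_key_id="S3RVER",
    aws_secret_access_key="S3RVER",
)
for obj in s3.list_objects_v2(Bucket="nzshm22-toshi-api-local").get("Contents", []):
    print(obj["Key"])
```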

Then, in the simple-toshi-ui repo, set `REACT_APP_GRAPH_ENDPOINT=http://localhost:5000/graphql` and run `yarn start`.

Now navigate to http://localhost:3000/Find and look up `R2VuZXJhbFRhc2s6MjQ4ODdRTkhH` to reach your test data.
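The same record can also be fetched straight from the API. A sketch assuming the schema exposes a Relay-style `node` field (the ID is the test-data ID above, base64 for `GeneralTask:24887QNHG`):

```python
# Fetch the test GeneralTask directly from the local GraphQL API
# (a sketch; assumes a Relay-style `node` field is exposed).
import requests

query = '{ node(id: "R2VuZXJhbFRhc2s6MjQ4ODdRTkhH") { __typename id } }'
resp = requests.post("http://localhost:5000/graphql", json={"query": query})
print(resp.json())
```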