# Pipeline discovery data match application

A prototype application to facilitate data matching between pipeline inspections.
## Project setup

Pre-requisites:

- PostgreSQL >= 9.5
- Python >= 3.6

Steps:

- Clone the repository:

  ```shell
  $ git clone git@github.com:devonwalshe/pd.git && cd pd
  ```

- Install the Python dependencies:

  ```shell
  $ pip install -r requirements.txt
  ```

- Bootstrap the seed data:

  ```shell
  $ createdb pd
  $ python bootstrap.py
  ```

- Start the flask API server:

  ```shell
  $ python -m api.api
  ```

- Test the API server response:

  http://localhost:5000/pipelines

- Start the frontend:

  ```shell
  $ cd frontend && PORT=3002 npm run dev
  ```

- Navigate to the frontend:

  http://localhost:3002
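If you'd rather check the API from Python than a browser, here is a stdlib-only sketch. It assumes the flask server from the steps above is running on the default port; `fetch_pipelines` is a hypothetical helper, not part of the project:

```python
import json
from urllib.request import urlopen

def fetch_pipelines(base_url="http://localhost:5000"):
    """Fetch and parse the pipeline list from the running API server."""
    with urlopen(f"{base_url}/pipelines") as resp:
        return json.load(resp)
```

Calling `fetch_pipelines()` after `python -m api.api` is up should return the same JSON the browser shows.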
## Project components

- API
  - Python ORM layer (peewee) to handle the data interface
  - Python web server (flask-restful) exposing a REST API
- Data
  - Inspection run test data
  - Matcher output test data
  - Database dumps
- Frontend
  - Frontend application facilitating the manual input for the data match (...)
- Matcher
  - Python application powering the automated data match
- Tests
  - Application-wide tests
## Introspecting application models

The model definitions, including their attributes and relations, are found in `api/models/models.py`. These don't necessarily make it clear which data is available on each item in the database, so the following instructions show how to get JSON-like responses from each model:
- Start an interactive Python session from the project root:

  ```shell
  $ python
  ```

  (Alternatively you can use `ipython`, which can be installed with `pip install ipython`.)

- Load in all the model definitions:

  ```python
  from api.models.models import *
  ```

- Get a single instance of any given model:

  ```python
  item = PipeSection.get_by_id(1)
  ```

  (You can replace `PipeSection` with any model from the `api/models/models.py` file.)

- Introspect the model relations and associated data:

  ```python
  model_to_dict(item)
  ```

  or `model_to_dict(item, recurse=False)` if you don't want to include the relations.