This repository packages the `OfflineModel` defined in `offline_model/model.py` for execution with Prefect, using the flow described in `offline_model/flow/flow.py`. The model uses the following input and output variables:
Input variables:

| variable name | type | default |
|---|---|---|
| input1 | scalar | 1 |
| input2 | scalar | 2 |

Output variables:

| variable name | type |
|---|---|
| output1 | scalar |
| output2 | scalar |
| output3 | scalar |
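
For orientation, evaluating the model directly might look like the following sketch. It assumes `OfflineModel` follows the usual lume-model convention of an `evaluate()` call keyed by the variable names above; the actual method name and signature in `offline_model/model.py` may differ.

```python
# Minimal usage sketch (not a confirmed API): assumes OfflineModel exposes an
# evaluate() method mapping the input variables above to the output variables.
from offline_model.model import OfflineModel

model = OfflineModel()

# The variable names and default values (1 and 2) come from the tables above;
# the dict-style call is an assumption and should be checked against model.py.
outputs = model.evaluate({"input1": 1.0, "input2": 2.0})
print(outputs)  # expected to contain output1, output2, and output3
```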
This package may be installed using pip:

```
pip install git+https://github.com/jbellister-slac/offline-model.git
```
Create the development environment:

```
conda env create -f dev-environment.yml
```

Activate the environment:

```
conda activate offline-model-dev
```

Install the package in editable mode:

```
pip install -e .
```
Tests can be executed from the root directory using:

```
pytest .
```
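
Since the package targets execution with Prefect, a local run of the flow might look like the sketch below. The module path comes from the description above, but the flow object name `flow` and the Prefect 1.x-style `flow.run()` call are assumptions; check `offline_model/flow/flow.py` for the actual interface.

```python
# Local Prefect run sketch (assumptions noted below, not a confirmed API).
# Assumes flow.py exposes a Prefect 1.x-style Flow object named `flow` whose
# parameters match the input variable names listed above.
from offline_model.flow.flow import flow

state = flow.run(parameters={"input1": 1.0, "input2": 2.0})
print(state.is_successful())
```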
This README was automatically generated using the template defined in https://github.com/slaclab/lume-services-model-template with the following configuration:
```json
{
  "author": "Jesse Bellister",
  "email": "jesseb@slac.stanford.edu",
  "github_username": "jbellister-slac",
  "github_url": "https://github.com/jbellister-slac/offline-model.git",
  "project_name": "Offline Model",
  "repo_name": "offline-model",
  "package": "offline_model",
  "model_class": "OfflineModel"
}
```