Middle-tier API using TensorFlow Serving.

See the interactive web documentation at `/docs` or `/redoc`.

Single entry point: `/api/v1/resnet`.
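For reference, a minimal sketch of calling the prediction endpoint from Python is shown below. The payload format (an uploaded image file) and the multipart field name `file` are assumptions; check `/docs` for the actual request schema.

```python
# Minimal sketch: POST an image to the prediction endpoint with `requests`.
# The multipart field name ("file") and the sample image path are assumptions;
# the interactive docs at /docs describe the real request schema.
import requests

API_URL = "http://localhost:8000/api/v1/resnet"  # port 8000 as published by `docker run` below

with open("cat.jpg", "rb") as image:             # hypothetical sample image
    response = requests.post(API_URL, files={"file": image})

response.raise_for_status()
print(response.json())                           # prediction produced by TensorFlow Serving
```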
You can test the code by executing `pytest --cov=main`,
or `docker run -h --rm -i -t ia_example:latest -c pytest --cov=main`.
You can also check the code quality with `flake8`.
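As an illustration, the kind of test that `pytest --cov=main` would pick up could look like the sketch below; it assumes `main.py` exposes a FastAPI instance named `app` and that the default docs routes are enabled.

```python
# test_main.py — illustrative test sketch (assumes main.py exposes `app`).
from fastapi.testclient import TestClient

from main import app

client = TestClient(app)


def test_docs_are_served():
    # The interactive documentation is expected at /docs.
    response = client.get("/docs")
    assert response.status_code == 200
```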
Build the image:

- With Docker: `docker build . --tag ia_example:latest`
- With Docker Compose: `docker-compose build`
The service can be configured with the following environment variables:

- `TENSORFLOW_PORT` (by default `8500`): allows changing the TensorFlow port.
- `TENSORFLOW_API_REST_PORT` (by default `8501`): allows changing the TensorFlow REST API port.
- `TENSORFLOW_HOST` (by default `localhost`): in order to use an internal or external TensorFlow service, you can use this variable to change the host used to process the prediction.
- `TENSORFLOW_ARGS` (empty by default): if `TENSORFLOW_HOST="localhost"`, the arguments passed to the `tensorflow_model_server` binary.
- `API_ARGS` (empty by default): arguments passed to `uvicorn`.
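A rough sketch of how these variables might be read inside the service is shown below; the variable names and defaults come from the list above, while the derived gRPC/REST targets are illustrative assumptions.

```python
# Sketch of reading the configuration; names and defaults match the list above,
# the derived connection targets are illustrative assumptions.
import os

TENSORFLOW_PORT = os.getenv("TENSORFLOW_PORT", "8500")
TENSORFLOW_API_REST_PORT = os.getenv("TENSORFLOW_API_REST_PORT", "8501")
TENSORFLOW_HOST = os.getenv("TENSORFLOW_HOST", "localhost")
TENSORFLOW_ARGS = os.getenv("TENSORFLOW_ARGS", "")  # extra flags for tensorflow_model_server
API_ARGS = os.getenv("API_ARGS", "")                # extra flags for uvicorn

# Hypothetical targets the API could use to reach TensorFlow Serving
GRPC_TARGET = f"{TENSORFLOW_HOST}:{TENSORFLOW_PORT}"
REST_PREDICT_URL = f"http://{TENSORFLOW_HOST}:{TENSORFLOW_API_REST_PORT}/v1/models/resnet:predict"
```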
Run the service:

- With Docker: `docker run -d -p 8000:8000 --name ia_example -e MODEL_NAME=resnet -t ia_example:latest`
- With Docker Compose: `docker-compose up -d`
  - NOTE: Docker Compose deploys two containers, one with the API and the other with TensorFlow Serving. It also includes the Prometheus monitoring system.
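Once the container(s) are up, a quick smoke test against the published port could look like this; it only assumes the default docs route and the port mapping from the `docker run` command above.

```python
# Smoke test: the API is published on port 8000 by the `docker run` command above.
import requests

resp = requests.get("http://localhost:8000/docs", timeout=5)
assert resp.status_code == 200, f"API not ready: HTTP {resp.status_code}"
print("API is up")
```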