# ML Model Serving

Serving services for all ML models used in this project with FastAPI and Docker.

## Features

Using ML Model Serving, you can:
- Get predictions from our ML model for food category, rating, and price.
- Get food recommendations from our ML model for a given `user_id`.
- Get status and metadata from our ML models.
## Prerequisites

- Python 3.10 or higher
- FastAPI 0.104.1 or higher
- Docker 24.0.7 or higher
- Docker Compose 2.15.1 or higher
## Installation

If you already have Docker and Docker Compose, you only need the `compose.yml` file.

1. Run the app with Docker Compose:

   ```
   docker compose up
   ```

2. Access the app at http://localhost:8080.

3. To make predictions or get food recommendations, see the API Endpoints section.
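Once the service is up, you can sanity-check it from Python. The snippet below is a minimal client sketch using only the standard library; the base URL matches the address above, and the endpoint paths come from the API Endpoints section.

```python
import json
import urllib.parse
import urllib.request

BASE_URL = "http://localhost:8080"  # where compose.yml publishes the app

def build_url(path: str, **params) -> str:
    """Join the base URL, an endpoint path, and encoded query parameters."""
    url = BASE_URL + path
    if params:
        url += "?" + urllib.parse.urlencode(params)
    return url

def get_json(path: str, **params) -> dict:
    """GET an endpoint and decode its JSON envelope."""
    with urllib.request.urlopen(build_url(path, **params)) as resp:
        return json.load(resp)

# With the service running (docker compose up), for example:
#   get_json("/")                            -> hello-world envelope
#   get_json("/recommendation", user_id=14)  -> top-5 recommendation
print(build_url("/recommendation", user_id=14))
```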
## Development

If you want to develop the model serving, follow these steps:

1. Clone the repository:

   ```
   git clone https://github.com/MamMates/ml-model-serving.git
   ```

2. Create and activate a virtual environment:

   ```
   python -m venv venv
   source venv/bin/activate
   ```

3. Install dependencies:

   ```
   pip install -r requirements.txt
   ```

4. To develop the FastAPI services, modify the `compose.yml` file, changing `image` in the `app` service to `build`:

   ```diff
   - image: putuwaw/mammates-model-serving
   + build: .
   ```

5. Run the app:

   ```
   docker compose up --build
   ```

6. Access the app at http://localhost:8080.

7. You can also run the tests using `pytest`:

   ```
   pytest
   ```
## API Endpoints

List of available endpoints:
### GET /

Get hello world.

Response:

```json
{
  "status": true,
  "code": 200,
  "message": "OK",
  "data": {
    "message": "Hello World"
  }
}
```
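Every endpoint wraps its payload in the same `status`/`code`/`message`/`data` envelope, so a client can unwrap responses uniformly. A minimal sketch (the helper name `unwrap` is ours, not part of the API):

```python
def unwrap(response: dict) -> dict:
    """Return the `data` payload of a successful envelope, or raise."""
    if not response.get("status") or response.get("code") != 200:
        raise RuntimeError(f"API error {response.get('code')}: {response.get('message')}")
    return response["data"]

# The hello-world response from GET /, as shown above.
hello = {
    "status": True,
    "code": 200,
    "message": "OK",
    "data": {"message": "Hello World"},
}
print(unwrap(hello))  # {'message': 'Hello World'}
```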
### GET /recommendation

Get food recommendation (top 5).

| Name | Params | Required | Type | Description |
|---|---|---|---|---|
| `user_id` | Query | required | integer | The id of the user. Example: `14` |

Response:

```json
{
  "status": true,
  "code": 200,
  "message": "OK",
  "data": {
    "food_id": [13, 14, 12, 2, 18]
  }
}
```
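Pulling the ranked ids out of the sample response above, assuming `food_id` is ordered best-first as the "top 5" wording suggests:

```python
import json

# Sample response from GET /recommendation?user_id=14, as shown above.
sample = '''
{
  "status": true,
  "code": 200,
  "message": "OK",
  "data": {"food_id": [13, 14, 12, 2, 18]}
}
'''

response = json.loads(sample)
food_ids = response["data"]["food_id"]  # ranked recommendation ids
top_3 = food_ids[:3]
print(top_3)  # [13, 14, 12]
```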
### GET /model/{model_name}

Get status and metadata from a model.

| Name | Params | Required | Type | Description |
|---|---|---|---|---|
| `model_name` | Path | required | string | The name of the model. Example: `food_clf` |

Response:

```json
{
  "status": true,
  "code": 200,
  "message": "OK",
  "data": {
    "status": {
      "model_version_status": [
        {
          "version": "1",
          "state": "AVAILABLE",
          "status": {
            "error_code": "OK",
            "error_message": ""
          }
        }
      ]
    },
    "metadata": {
      "model_spec": {
        "name": "food_clf",
        "signature_name": "",
        "version": "1"
      },
      "metadata": {
        "signature_def": {
          "signature_def": {
            "serving_default": {
              "inputs": {
                "input_2": {
                  "dtype": "DT_FLOAT",
                  "tensor_shape": {
                    "dim": [
                      { "size": "-1", "name": "" },
                      { "size": "150", "name": "" },
                      { "size": "150", "name": "" },
                      { "size": "3", "name": "" }
                    ],
                    "unknown_rank": false
                  },
                  "name": "serving_default_input_2:0"
                }
              },
              "outputs": {
                "dense": {
                  "dtype": "DT_FLOAT",
                  "tensor_shape": {
                    "dim": [
                      { "size": "-1", "name": "" },
                      { "size": "10", "name": "" }
                    ],
                    "unknown_rank": false
                  },
                  "name": "StatefulPartitionedCall:0"
                }
              },
              "method_name": "tensorflow/serving/predict",
              "defaults": {}
            },
            "__saved_model_init_op": {
              "inputs": {},
              "outputs": {
                "__saved_model_init_op": {
                  "dtype": "DT_INVALID",
                  "tensor_shape": {
                    "dim": [],
                    "unknown_rank": true
                  },
                  "name": "NoOp"
                }
              },
              "method_name": "",
              "defaults": {}
            }
          }
        }
      }
    }
  }
}
```
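The payload above is the standard TensorFlow Serving model status/metadata format. A short sketch of digging the expected input shape out of it, hard-coding a trimmed copy of the sample response (in practice you would fetch it from `/model/food_clf`):

```python
# Metadata trimmed to the fields we read, taken from the sample response.
metadata = {
    "signature_def": {
        "signature_def": {
            "serving_default": {
                "inputs": {
                    "input_2": {
                        "dtype": "DT_FLOAT",
                        "tensor_shape": {
                            "dim": [
                                {"size": "-1", "name": ""},
                                {"size": "150", "name": ""},
                                {"size": "150", "name": ""},
                                {"size": "3", "name": ""},
                            ],
                            "unknown_rank": False,
                        },
                    }
                }
            }
        }
    }
}

sig = metadata["signature_def"]["signature_def"]["serving_default"]
tensor = sig["inputs"]["input_2"]
# Sizes are strings in the TF Serving payload; -1 means "any batch size".
shape = [int(d["size"]) for d in tensor["tensor_shape"]["dim"]]
print(shape)  # [-1, 150, 150, 3]: batches of 150x150 RGB images
```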
### POST /predict

Predict the category, rating, and price of a food image.

| Name | Params | Required | Type | Description |
|---|---|---|---|---|
| `province` | Query | required | string | The province of the seller. Example: `Bali` |
| `environment` | Query | optional | string | The environment of the seller. Default: `campus` |
| `name` | Query | optional | string | The name of the food. Default: `null`. Example: `donat ubi mawar` |
| `image` | Body | required | file | The image to predict |

Response:

```json
{
  "status": true,
  "code": 200,
  "message": "OK",
  "data": {
    "category": 2,
    "rating": 3,
    "price": 10000
  }
}
```
## License

This project is licensed under the MIT License. See the LICENSE file for details.