API for a linear regression model. Uses the Titanic dataset from kaggle.com. The model predicts the survival probability of a passenger; for prediction, JSON is passed in the request body.
$ virtualenv api -p python3
$ cd api && source bin/activate
$ git clone https://github.com/nandini8/LR_api.git
$ cd LR_api
$ pip install -r requirements.txt
$ python scripts/app.py
For training: /train
For predicting: /predict
Run training before predicting.
Prediction demo data:
[{"Age": 85, "Sex": "male", "Embarked": "S"},
{"Age": 24, "Sex": "female", "Embarked": "C"},
{"Age": 3, "Sex": "male", "Embarked": "C"},
{"Age": 21, "Sex": "male", "Embarked": "S"}]
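The records above can be POSTed to /predict as a JSON array. A minimal client sketch in Python (the base URL and the use of `requests` are assumptions; the README only specifies the endpoint paths and that /train must run first):

```python
import json

# Demo passengers from the README
passengers = [
    {"Age": 85, "Sex": "male", "Embarked": "S"},
    {"Age": 24, "Sex": "female", "Embarked": "C"},
    {"Age": 3, "Sex": "male", "Embarked": "C"},
    {"Age": 21, "Sex": "male", "Embarked": "S"},
]

payload = json.dumps(passengers)

if __name__ == "__main__":
    import requests  # assumed to be available alongside requirements.txt

    base = "http://localhost:5000"
    requests.get(f"{base}/train")  # training must run before predicting
    resp = requests.post(f"{base}/predict", data=payload,
                         headers={"Content-Type": "application/json"})
    print(resp.text)
```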
docker build -t tag-name .
docker run -d -p 5000:5000 tag-name
- Create an EC2 instance
- To launch an existing EC2 instance from the CLI, note down:
- image id ami-xxxxxx
- instance type
- key-name xxx.pem
- security group id sg-xxxxx
- subnet id subnet-xxxxx
aws ec2 run-instances --image-id ami-xxxx --count 1 --instance-type t2.micro --key-name file-name --security-group-ids sg-xxxx --subnet-id subnet-xxxx
- Connect to the instance
ssh -i "path/to/xxx.pem" user@public-dns.compute.amazonaws.com
apt-get update
apt-get install docker.io
# or: apt-get install docker
git clone https://github.com/nandini8/LR_api
cd LR_api
docker build -t ml_api .
docker run -d -p 80:5000 ml_api
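Since `docker run` maps host port 80 to the container's port 5000, the API is now reachable over plain HTTP on the instance's public DNS. A small smoke-test sketch (the hostname is a placeholder for your instance's address):

```python
import json
from urllib.request import Request, urlopen

HOST = "http://ec2-public-dns.compute.amazonaws.com"  # placeholder

def endpoint(host, path):
    # Host port 80 is mapped to the container's 5000, so no :5000 suffix
    return f"{host}/{path}"

if __name__ == "__main__":
    urlopen(endpoint(HOST, "train"))  # run training first
    body = json.dumps([{"Age": 21, "Sex": "male", "Embarked": "S"}]).encode()
    req = Request(endpoint(HOST, "predict"), data=body,
                  headers={"Content-Type": "application/json"})
    print(urlopen(req).read().decode())
```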
- Make the /train endpoint accept POST as well, so that a custom dataset can be passed in for training
- Extend the API to serve multiple ML models rather than being restricted to a single one.
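The multiple-models idea could be supported with a simple registry. A sketch, where the model names and the notion of looking up an estimator per request are hypothetical, not part of the current API:

```python
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression

# Hypothetical registry: /train and /predict could accept a model name
# and look up the estimator here instead of hard-coding a single one.
MODELS = {
    "logreg": LogisticRegression(max_iter=1000),
    "forest": RandomForestClassifier(n_estimators=50, random_state=0),
}

def get_model(name):
    try:
        return MODELS[name]
    except KeyError:
        raise ValueError(f"unknown model: {name!r}") from None
```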