
Gather-Tensorflow-Serving

Gathering as many ways as I can to deploy TensorFlow models, using Nginx, Hadoop, Kafka, Flask, Gunicorn, SocketIO, Docker Swarm, Luigi (Spotify), and more.

Covered

  1. Object detection using Flask SocketIO for WebRTC
  2. Object detection using Flask SocketIO for OpenCV
  3. Speech streaming using Flask SocketIO
  4. Classification using Flask + Gunicorn (see the sketch after this list)
  5. Classification using TF Serving
  6. Inception classification using Flask SocketIO
  7. Object detection using Flask + OpenCV
  8. Face detection using Flask SocketIO for OpenCV
  9. Face detection for OpenCV
  10. Inception with Flask using Docker
  11. Multiple Inception with Flask using EC2 Docker Swarm + Nginx load balancer
  12. Text classification using Hadoop streaming MapReduce
  13. Text classification using Kafka
  14. Text classification on Distributed TF using Flask + Gunicorn + Eventlet
  15. Text classification using Tornado + Gunicorn
  16. Celery with Hadoop for massive text classification using Flask
  17. Luigi scheduler with Hadoop for massive text classification
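
As a taste of item 4 (Classification using Flask + Gunicorn), here is a minimal sketch of the pattern. It assumes a TensorFlow 1.x frozen graph saved as `model.pb` with hypothetical tensor names `input:0` and `logits:0`; adjust these to whatever your own model exports, and check the corresponding folder for the exact code.

```python
# Minimal sketch: serve a frozen TensorFlow 1.x classifier behind Flask.
# Assumptions (not from this repo): model.pb, tensor names 'input:0' / 'logits:0'.
import numpy as np
import tensorflow as tf
from flask import Flask, request, jsonify

app = Flask(__name__)

# Load the frozen graph once at startup so every request reuses the same session.
graph_def = tf.GraphDef()
with tf.gfile.GFile('model.pb', 'rb') as f:
    graph_def.ParseFromString(f.read())
graph = tf.Graph()
with graph.as_default():
    tf.import_graph_def(graph_def, name='')
sess = tf.Session(graph=graph)

@app.route('/classify', methods=['POST'])
def classify():
    # Expect a JSON body like {"input": [[...feature vector...]]}.
    batch = np.array(request.get_json()['input'], dtype=np.float32)
    logits = sess.run('logits:0', feed_dict={'input:0': batch})
    return jsonify({'prediction': logits.argmax(axis=1).tolist()})

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=8080)
```

To put it behind Gunicorn, run something like `gunicorn -w 4 -b 0.0.0.0:8080 app:app` (assuming the file is saved as app.py); each worker loads its own copy of the graph.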

Technology used

  1. Flask
  2. Flask SocketIO
  3. Gunicorn
  4. Eventlet
  5. Tornado
  6. Celery
  7. Hadoop
  8. Kafka
  9. Nginx
  10. WebRTC
  11. Luigi (Spotify)

Printscreen

All folders contain printscreens or logs.

Notes

  1. If you deploy on a server, change the local address in the code snippets to your own IP.
  2. WebRTC in Chrome can only be tested on an HTTPS server.
  3. For real deployments, always prepare your architecture to scale up. Learn about DevOps.
  4. Please be aware of your cloud costs!