Jina-Practice-Codes

Jina lets you build multimodal AI services that communicate via gRPC.

Jina provides a smooth Pythonic experience for serving ML models, from local deployment all the way to advanced orchestration frameworks such as Docker Compose, Kubernetes, or Jina AI Cloud.

Where FastAPI relies on Pydantic for its data models, Jina relies on DocArray, which allows Jina to expose its services over multiple protocols (gRPC, HTTP, and WebSockets).
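
As an illustration of that difference, here is a minimal sketch of how data is modelled with DocArray, much as a FastAPI service would declare Pydantic models. It assumes DocArray >= 0.30 (the BaseDoc API); the document class and field names are made up for the example.

```python
import numpy as np
from docarray import BaseDoc, DocList
from docarray.typing import NdArray


class TextDoc(BaseDoc):
    """Schema of a single document flowing through the service."""
    text: str = ''
    embedding: NdArray[128]  # fixed-size vector field


# A DocList is the batch of documents a Jina service sends and receives.
docs = DocList[TextDoc]([TextDoc(text='hello jina', embedding=np.zeros(128))])

for doc in docs:
    print(doc.text, doc.embedding.shape)  # -> hello jina (128,)
```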

Jina provides a smooth transition from local development (using DocArray), to local serving (using Deployment and Flow), to production-ready services that use Kubernetes to orchestrate the lifetime of containers.
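
The last stage of that transition can be sketched as follows: instead of serving a Flow locally, its Kubernetes manifests are exported and applied to a cluster. This is only a sketch assuming Jina 3.x; the Docker image, output path, and namespace are placeholders (for Kubernetes the Executor has to be containerized).

```python
from jina import Flow

# Placeholder containerized Executor; replace with a real image or Hub Executor.
f = Flow(port=8080).add(uses='docker://my-executor-image')

# Writes Kubernetes YAML files (one folder per Executor plus the gateway)
# that can then be applied with kubectl.
f.to_kubernetes_yaml('./k8s_flow', k8s_namespace='my-namespace')
```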

  1. Install Jina: `pip install -U jina`

  2. Start by writing the Executor code. An Executor lets you build a fast, scalable, and reliable gRPC-based AI service (see the Executor sketch after this list).

  3. Write the deployment code using the Python API or YAML (a Python API sketch also follows this list).

  4. For a YAML file, run: `jina deployment --uses deployment.yml`
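
A minimal Executor sketch for step 2 could look like the following; it assumes Jina 3.x with DocArray >= 0.30, and the class and field names are illustrative, not taken from this repo.

```python
from docarray import BaseDoc, DocList
from jina import Executor, requests


class TextDoc(BaseDoc):
    text: str = ''


class UppercaseExecutor(Executor):
    """Toy Executor that upper-cases the text of every incoming document."""

    @requests  # bind this handler to all endpoints
    def uppercase(self, docs: DocList[TextDoc], **kwargs) -> DocList[TextDoc]:
        for doc in docs:
            doc.text = doc.text.upper()
        return docs
```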
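
And a sketch of step 3 using the Python API, assuming the Executor above is saved in a module named executor.py and that the port is a placeholder:

```python
from jina import Deployment

from executor import UppercaseExecutor  # hypothetical module holding the sketch above

dep = Deployment(uses=UppercaseExecutor, port=12345)

with dep:
    dep.block()  # serve over gRPC until the process is interrupted
```

The same Deployment can alternatively be described in a deployment.yml file and started with the CLI command from step 4.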