djl-akka-http


An example of how to use the Deep Java Library (DJL.ai) in Scala's Akka-Http framework.

The service exposes a POST /inferences endpoint that accepts a JSON body:

{"text":"whatever"} 

which computes a text embedding for the given text and returns the embedding as a string:

{
    "vector": "Array(-0.026074253, -0.08460002, ...,"
}
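
A route handling this request might look roughly like the sketch below, using Akka HTTP's routing DSL with spray-json marshalling. The object, case-class, and helper names are illustrative assumptions, not the repo's actual code:

import akka.actor.typed.ActorSystem
import akka.actor.typed.scaladsl.Behaviors
import akka.http.scaladsl.Http
import akka.http.scaladsl.marshallers.sprayjson.SprayJsonSupport._
import akka.http.scaladsl.server.Directives._
import spray.json.DefaultJsonProtocol._
import spray.json.RootJsonFormat

// Hypothetical names; the actual classes in this repo may differ.
object InferenceServer {
  final case class InferenceRequest(text: String)
  final case class InferenceResponse(vector: String)

  implicit val requestFormat: RootJsonFormat[InferenceRequest]   = jsonFormat1(InferenceRequest)
  implicit val responseFormat: RootJsonFormat[InferenceResponse] = jsonFormat1(InferenceResponse)

  // Stand-in for the DJL-backed embedding call (see the DJL sketch further down).
  def embed(text: String): Array[Float] = Array.fill(3)(0.0f)

  val route =
    path("inferences") {
      post {
        entity(as[InferenceRequest]) { req =>
          val embedding = embed(req.text)
          complete(InferenceResponse(embedding.mkString("Array(", ", ", ")")))
        }
      }
    }

  def main(args: Array[String]): Unit = {
    implicit val system: ActorSystem[Nothing] = ActorSystem(Behaviors.empty, "inference-server")
    Http().newServerAt("127.0.0.1", 8080).bind(route)
    println("Server online at http://127.0.0.1:8080/")
  }
}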

You can also call the endpoint with cURL:

curl --location --request POST 'http://127.0.0.1:8080/inferences' \
--header 'Content-Type: application/json' \
--data-raw '{"text": "whatever"}'
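
Under the hood, the embedding comes from a DJL model. A minimal sketch of loading and querying such a model is shown below; the translator, model URL, and helper names are assumptions for illustration, not the repo's actual code:

import ai.djl.repository.zoo.{Criteria, ZooModel}
import ai.djl.translate.Translator

object Embeddings {

  // Load a text-embedding model; `embeddingTranslator` is an assumed
  // Translator[String, Array[Float]] that tokenizes the text and converts
  // the model's output tensor into a float array.
  def loadModel(embeddingTranslator: Translator[String, Array[Float]]): ZooModel[String, Array[Float]] =
    Criteria.builder()
      .setTypes(classOf[String], classOf[Array[Float]])
      .optEngine("TensorFlow")                      // matches the TensorFlow tuning notes below
      .optModelUrls("file:///path/to/saved_model")  // hypothetical model location
      .optTranslator(embeddingTranslator)
      .build()
      .loadModel()

  // DJL predictors are not thread-safe, so a real service would create one
  // per request (or pool them); this sketch just shows a single call.
  def embed(model: ZooModel[String, Array[Float]], text: String): Array[Float] = {
    val predictor = model.newPredictor()
    try predictor.predict(text)
    finally predictor.close()
  }
}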

Install sbt; for macOS, see https://www.scala-sbt.org/1.x/docs/Installing-sbt-on-Mac.html

Run the service:

sbt run

You should see: Server online at http://127.0.0.1:8080/

Run unit tests:

sbt test

To tune TensorFlow performance, set the following environment variables:

export OMP_NUM_THREADS=1
export TF_NUM_INTEROP_THREADS=1
export TF_NUM_INTRAOP_THREADS=1
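
OMP_NUM_THREADS caps the number of OpenMP threads used by the native library, while TF_NUM_INTEROP_THREADS and TF_NUM_INTRAOP_THREADS cap TensorFlow's inter-op and intra-op thread pools. Pinning all three to 1 keeps each inference single-threaded, which can reduce thread contention when Akka-Http is already serving many requests concurrently.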

For more information on optimization, see the DJL inference performance documentation.