tensorflow/recommenders-addons

TFRA integration with TensorFlow Serving 1.15

liaocz opened this issue · 6 comments

System information

  • OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Linux Ubuntu 18.04
  • TensorFlow version and how it was installed (source or binary): install tensorflow 1.15.2 from binary
  • TensorFlow Serving version and how it was installed: install tensorflow serving 1.15 from source code
  • TensorFlow-Recommenders-Addons version and how it was installed (source or binary): source
  • Python version: 3.6
  • Is GPU used? (yes/no): no

Describe the bug

We can't compile TensorFlow Serving 1.15 with TFRA correctly by following the description in the README file.

We have compiled TFRA with TensorFlow Serving 1.15 successfully; the process is as follows:

  • download TFRA source code
    git clone https://github.com/tensorflow/recommenders-addons.git
  • download Tensorflow-Serving source code
    git clone -b r1.15 https://github.com/tensorflow/serving.git
  • copy tensorflow_recommenders_addons and build-deps from TFRA to tensorflow-serving
    cp -r recommenders-addons/tensorflow_recommenders_addons serving/
    cp -r recommenders-addons/build_deps serving/
  • install tensorflow
    pip install tensorflow==1.15.2
  • generate .bazelrc for TFRA
    cd recommenders-addons/ && python configure.py
    change TF_HEADER_DIR and FOR_TF_SERVING in .bazelrc to
    TF_HEADER_DIR=/tensorflow-recommenders-addons/build_deps/tf_header/1.15.2/tensorflow
    FOR_TF_SERVING="1"
    after this, the .bazelrc file under recommenders-addons looks as follows:
    build --action_env TF_HEADER_DIR="/tensorflow-recommenders-addons/build_deps/tf_header/1.15.2/tensorflow"
    build --action_env TF_SHARED_LIBRARY_DIR="/usr/local/lib/python3.6/dist-packages/tensorflow"
    build --action_env TF_SHARED_LIBRARY_NAME="libtensorflow_framework.so.2"
    build --action_env TF_CXX11_ABI_FLAG="0"
    build --action_env TF_VERSION_INTEGER="1152"
    build --action_env FOR_TF_SERVING="1"
    build --spawn_strategy=standalone
    build --strategy=Genrule=standalone
    build -c opt
    build --copt=-mavx
    
  • merge .bazelrc file
    cat .bazelrc >> ../serving/.bazelrc
  • merge WORKSPACE file
    1. delete the first line of WORKSPACE file under recommenders-addons directory
      workspace(name = "tf_recommenders_addons")
    2. merge with WORKSPACE file under serving directory
      cat WORKSPACE >> ../serving/WORKSPACE
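The two WORKSPACE steps above can be sketched as a short shell snippet (the relative paths assume the side-by-side checkout layout used in this issue):

```shell
# Run from the recommenders-addons checkout.
# Drop line 1, which declares workspace(name = "tf_recommenders_addons"),
# then append the remaining rules to the serving WORKSPACE.
sed -i '1d' WORKSPACE
cat WORKSPACE >> ../serving/WORKSPACE
```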
  • modify the BUILD file under serving/tensorflow_serving/model_servers to integrate the TFRA operators
    1. add the op information (tensorflow_text will not be found, so ignore it)
      (screenshot of the BUILD edits, not preserved here)
    2. add linkopts to avoid a multiple-definition error
      (screenshot of the linkopts edits, not preserved here)
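Since the screenshots above may not render, here is a hedged sketch of the kind of edit meant for serving/tensorflow_serving/model_servers/BUILD. The exact TFRA target labels (e.g. the _cuckoo_hashtable_ops.so label) are assumptions and should be checked against the TFRA source tree:

```starlark
# Sketch only: register the TFRA custom-op libraries with the model server
# and allow multiple definitions at link time.
SUPPORTED_TENSORFLOW_OPS = [
    # "@org_tensorflow_text//tensorflow_text:ops_lib",  # not found for 1.15, ignored
    "//tensorflow_recommenders_addons/dynamic_embedding/core:_cuckoo_hashtable_ops.so",
]

cc_binary(
    name = "tensorflow_model_server",
    linkopts = ["-Wl,--allow-multiple-definition"],  # avoids the multiple-definition error
    deps = SUPPORTED_TENSORFLOW_OPS + [
        # ... existing model server deps ...
    ],
)
```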
  • compile TensorFlow Serving with Bazel
    bazel build -c opt //tensorflow_serving/model_servers:tensorflow_model_server

We have also successfully built a Docker image to deploy a model trained with TFRA.

Users will be confused by the current document. Would it be better to re-describe how TFRA integrates with TensorFlow Serving 1.15 in the documentation, or to provide a Dockerfile for TensorFlow Serving 1.15 with TFRA?
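As a starting point for the Dockerfile suggestion, a minimal sketch might look like the following; this is illustrative only, and the binary path assumes the model server was built with Bazel as described above:

```dockerfile
# Illustrative sketch only, not a tested image definition.
FROM ubuntu:18.04
# Copy in the model server binary built with TFRA linked in.
COPY bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server \
     /usr/local/bin/tensorflow_model_server
# Standard TensorFlow Serving gRPC and REST ports.
EXPOSE 8500 8501
ENTRYPOINT ["/usr/local/bin/tensorflow_model_server"]
```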

Hi @Mr-Nineteen, can you help?

@rhdong can you confirm the correctness of the above? If there is no problem, we can also assist in completing the documentation.

@liaocz hello

  1. If local_config_cuda is copied into the serving workspace, shouldn't there be conflicts during compilation?
  2. Regarding "adding linkopts to avoid multiple definition error": we did not encounter this when compiling with 2.4.1.

Others are OK.

@Mr-Nineteen @rhdong That's OK. Shall we create a pull request to update the documentation for integrating TensorFlow Serving 1.15 (CPU)? We can also validate the GPU build later.

Hi, @liaocz. Is there further update on this issue?

Hi @liaocz, we'll close this issue due to inactivity. Please feel free to reopen it if there is any problem. Thank you!