[Updated 2/2/2018]
- `libtensorflow_framework.so` not found! Locate the file at `~/Desktop/tensorflow/bazel-bin/tensorflow`. Need to run `sudo ldconfig` to refresh the linker.
- `nsync.h` not found! Download it from the nsync git repo, and place it in the right folder, `tensorflow/core/platform/default/`.
- TensorFlow version = `1.5.0-rc1/master`; Bazel version = `0.9.0`; protobuf version = `3.4.0`.
- Remember to run `./configure` in the tensorflow main repo before running the bazel command `bazel build //tensorflow:libtensorflow_cc.so`.
- `*.pb.h` missing, `libcupti.*` missing, or `extras/CUPTI`-related problems:
    - Install `libcupti8.0` and `libcupti-dev` from deb packages. See this post.
    - Create the `extras/CUPTI` directory and link it with the libcupti libraries. See this post.
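If running `sudo ldconfig` is not an option (for example, without root access), a common workaround, not part of the original notes, is to point the dynamic linker at the bazel output directory directly:

```shell
# Workaround sketch (assumes the checkout path from the note above;
# adjust to your own location): make the dynamic linker search bazel's
# output directory for libtensorflow_framework.so at program start-up.
export LD_LIBRARY_PATH="$HOME/Desktop/tensorflow/bazel-bin/tensorflow:$LD_LIBRARY_PATH"
echo "$LD_LIBRARY_PATH"
```

This only affects the current shell; copying the library to `/usr/local/lib` and running `sudo ldconfig` remains the persistent fix.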
- Install `protobuf`, `bazel`, and `eigen`. `eigen` comes with VENTOS already, so there is no need to install it again. `protobuf` needs to be installed from source; follow the instructions here to install protobuf from source. Released `bazel` versions can be downloaded from here; use the following command to install `bazel`: `sudo dpkg -i bazel_0.9.0-linux-x86_64.deb`. For tensorflow v1.5.0-rc1 and bazel 0.9.0, only protobuf v3.4.0 works. (As of Feb 2, 2018, bazel 0.10.0 does not work with tensorflow v1.5.0-rc1. Use bazel 0.9.0 instead.)
- Install `cuda-8.0` and `cudnn-6.1`. Follow this instruction.
- Start building the tensorflow libraries:

        ./configure
        bazel build //tensorflow:libtensorflow_cc.so
- Merge `bazel-genfiles/tensorflow` with `tensorflow/tensorflow`.
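The merge step is terse in the original notes; a plausible reading (an assumption, not confirmed by the post) is copying the generated files, such as the `*.pb.h` headers, from `bazel-genfiles/tensorflow` into the source tree without overwriting existing files. Sketched here on a mock directory layout so it is safe to try anywhere:

```shell
# Mock layout standing in for the real tensorflow checkout.
mkdir -p demo/bazel-genfiles/tensorflow demo/tensorflow
echo "// generated header" > demo/bazel-genfiles/tensorflow/example.pb.h
echo "// existing source"  > demo/tensorflow/existing.h
# -r: recurse; -n: never overwrite a file that already exists in the destination.
cp -rn demo/bazel-genfiles/tensorflow/. demo/tensorflow/
ls demo/tensorflow
```

In the real checkout the same `cp -rn bazel-genfiles/tensorflow/. tensorflow/` shape would apply, run from the tensorflow source root.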
- Then copy the following include headers and dynamic shared libraries to `/usr/local/include` and `/usr/local/lib` (run from the `tensorflow` main folder):

        cp -r tensorflow /usr/local/include/
        cp -r third_party /usr/local/include/
        cp -r bazel-bin/tensorflow/libtensorflow_cc.so /usr/local/lib/
        cp -r bazel-bin/tensorflow/libtensorflow_framework.so /usr/local/lib/
- Lastly, compile using an example:

        g++ -std=c++11 -o tTest test.cc \
            -I/usr/local/include/tensorflow/contrib/makefile/downloads/absl \
            -I/usr/local/include/tf -I/usr/local/include/eigen3 \
            -g -Wall -D_DEBUG -Wshadow -Wno-sign-compare -w \
            -L/usr/local/lib -ltensorflow_cc -ltensorflow_framework \
            `pkg-config --cflags --libs protobuf`
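The notes never show the contents of `test.cc`. A minimal sketch, adapted from the standard TensorFlow 1.x C++ API example (only the file name `test.cc` comes from the command above), could look like this:

```cpp
// Minimal sketch of test.cc: build a tiny graph (a 1x2 by 2x1 matrix
// product) and run it in a session, using the TF 1.x C++ API.
#include "tensorflow/cc/client/client_session.h"
#include "tensorflow/cc/ops/standard_ops.h"
#include "tensorflow/core/framework/tensor.h"

int main() {
  using namespace tensorflow;
  using namespace tensorflow::ops;

  Scope root = Scope::NewRootScope();
  auto A = Const(root, {{3.f, 2.f}});           // 1x2 matrix
  auto B = Const(root, {{-1.f}, {2.f}});        // 2x1 matrix
  auto v = MatMul(root.WithOpName("v"), A, B);  // 1x1 result

  ClientSession session(root);
  std::vector<Tensor> outputs;
  TF_CHECK_OK(session.Run({v}, &outputs));
  LOG(INFO) << outputs[0].matrix<float>();      // 3*(-1) + 2*2 = 1
  return 0;
}
```

If this compiles with the `g++` command above and prints a 1x1 matrix, the headers and shared libraries were installed correctly.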
- Add `-I/usr/local/include/eigen3` to `Property->OMNeT++->Makemake->src->Options->Preview`.
- Add `tensorflow_cc` and `tensorflow_framework` to `Property->OMNeT++->Makemake->src->Options->Link`.
- Include `tensorflow` header files.
- In this post one can find an example of how to load a graph into C++.
- In this post one can find a detailed example of how to save/load a checkpoint in Python.
Notes on exporting graphs and models in tensorflow:

- A simple tensorflow graph file (`*.pb`) contains only graph topology information. It doesn't contain any variables, such as weights or biases.
- A metagraph (checkpoint files) contains graph information in `*.meta`, and variable values in `*.data` and `*.index` files.
- One can consolidate `*.pb` and `*.meta`/`*.data`/`*.index` into a single `*.pb` file, known as a frozen graph. (The Python API provides a `freeze_graph.py` script to freeze a graph.)
- However, a frozen graph can only be used for inference. One can reload it in other applications and use it for tasks like prediction, but it can't be used for further training, because the variables are converted to constants when the graph is frozen.
- To make a saved graph usable for continued training, load the checkpoint files instead (at least for TF version 1.3.0). This is so far the only way to reuse a pre-trained model for continued training.
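The freezing step described above can be sketched as a command line. This is a sketch only: `graph.pb`, `model.ckpt`, and `output` are placeholder names for your own graph file, checkpoint prefix, and output node; the script ships with the TensorFlow Python package as `freeze_graph.py`.

```shell
# Sketch: consolidate a GraphDef and its checkpoint into one frozen .pb.
# All file names and the node name below are placeholders.
python -m tensorflow.python.tools.freeze_graph \
  --input_graph=graph.pb \
  --input_checkpoint=model.ckpt \
  --output_node_names=output \
  --output_graph=frozen_graph.pb
```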
Written with StackEdit.