This repository has been archived by the owner on Mar 2, 2022. It is now read-only.

How to offload OpenVINO non compliant layer to Tensorflow

PINTO edited this page Jan 14, 2019 · 21 revisions

Overview

This page records attempts to build libtensorflow_call_layer.so with the tf_call_ie_layer sample bundled in the OpenVINO (Computer Vision SDK) Model Optimizer. The resulting library lets layers that OpenVINO does not support be offloaded to TensorFlow at inference time. Several TensorFlow / Bazel version combinations are tried below; most of them produce a broken ".so".

Procedure

### xxxx = your user name; v1.x.x = the TensorFlow tag to build against
$ export TF_ROOT_DIR=/home/xxxx/tensorflow
$ sudo rm -rf tensorflow
$ sudo rm -rf /home/xxxx/.cache/bazel/_bazel_root
$ git clone -b v1.x.x https://github.com/tensorflow/tensorflow.git
$ cd tensorflow
$ git checkout -b v1.x.x
$ sudo -E /opt/intel/computer_vision_sdk/deployment_tools/model_optimizer/tf_call_ie_layer/build.sh

Results

By TensorFlow version

tensorflow v1.4.1 + bazel 0.11.1 --- the built ".so" is broken

$ mkdir bazel;cd bazel
$ wget https://github.com/bazelbuild/bazel/releases/download/0.11.1/bazel-0.11.1-dist.zip
$ unzip bazel-0.11.1-dist.zip;rm bazel-0.11.1-dist.zip
$ sudo bash ./compile.sh
$ sudo cp output/bazel /usr/local/bin
$ cd ~;mkdir work;cd work
$ sudo rm -rf tensorflow
$ sudo rm -rf /home/xxxx/.cache/bazel/_bazel_root

$ git clone -b v1.4.1 https://github.com/tensorflow/tensorflow.git
$ export TF_ROOT_DIR=/home/xxxx/work/tensorflow
$ cd $TF_ROOT_DIR
$ git checkout -b v1.4.1
$ ./configure
You have bazel 0.11.1- (@non-git) installed.
Please specify the location of python. [Default is /usr/bin/python]: /usr/bin/python3

Found possible Python library paths:
  /home/xxxx/git/tensorflow/models/research/object_detection
  /opt/intel//computer_vision_sdk_2018.5.445/python/python3.5
  /usr/local/lib
  /home/xxxx/git/caffe-jacinto/python
  .
  /opt/movidius/caffe/python
  /home/xxxx/git/tensorflow/models/research
  /opt/intel//computer_vision_sdk_2018.5.445/deployment_tools/model_optimizer
  /opt/intel//computer_vision_sdk_2018.5.445/python/python3.5/ubuntu16
  /usr/lib/python3/dist-packages
  /usr/local/lib/python3.5/dist-packages
Please input the desired Python library path to use.  Default is [/home/xxxx/git/tensorflow/models/research/object_detection] /usr/local/lib/python3.5/dist-packages

Do you wish to build TensorFlow with jemalloc as malloc support? [Y/n]: y
jemalloc as malloc support will be enabled for TensorFlow.

Do you wish to build TensorFlow with Google Cloud Platform support? [Y/n]: n
No Google Cloud Platform support will be enabled for TensorFlow.

Do you wish to build TensorFlow with Hadoop File System support? [Y/n]: n
No Hadoop File System support will be enabled for TensorFlow.

Do you wish to build TensorFlow with Amazon S3 File System support? [Y/n]: n
No Amazon S3 File System support will be enabled for TensorFlow.

Do you wish to build TensorFlow with XLA JIT support? [y/N]: n
No XLA JIT support will be enabled for TensorFlow.

Do you wish to build TensorFlow with GDR support? [y/N]: n
No GDR support will be enabled for TensorFlow.

Do you wish to build TensorFlow with VERBS support? [y/N]: n
No VERBS support will be enabled for TensorFlow.

Do you wish to build TensorFlow with OpenCL support? [y/N]: n
No OpenCL support will be enabled for TensorFlow.

Do you wish to build TensorFlow with CUDA support? [y/N]: n
No CUDA support will be enabled for TensorFlow.

Do you wish to build TensorFlow with MPI support? [y/N]: n
No MPI support will be enabled for TensorFlow.

Please specify optimization flags to use during compilation when bazel option "--config=opt" is specified [Default is -march=native]: 

Add "--config=mkl" to your bazel command to build with MKL support.
Please note that MKL on MacOS or windows is still not supported.
If you would like to use a local MKL instead of downloading, please set the environment variable "TF_MKL_ROOT" every time before build.
Configuration finished
$ nano /home/xxxx/work/tensorflow/tensorflow/workspace.bzl
### Comment out the Bazel minimum-version check as shown:

#    if minimum_bazel_version > current_bazel_version:
#      fail("\nCurrent Bazel version is {}, expected at least {}\n".format(
#          native.bazel_version, bazel_version))

See (https://github.com/tensorflow/tensorflow/issues/15492)
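If you prefer not to edit the file by hand, the same version check can be commented out with GNU sed (the `addr,+N` range is a GNU extension). A minimal sketch on a scratch copy; the real path is the workspace.bzl shown above:

```shell
# Reproduce the version-check snippet in a scratch file, then comment it
# out with sed -- the same edit made by hand in workspace.bzl above.
cat > /tmp/workspace_check.bzl <<'EOF'
    if minimum_bazel_version > current_bazel_version:
      fail("\nCurrent Bazel version is {}, expected at least {}\n".format(
          native.bazel_version, bazel_version))
EOF
# Prefix the matching line and the two lines after it with '#'
sed -i '/minimum_bazel_version > current_bazel_version/,+2 s/^/#/' /tmp/workspace_check.bzl
grep -c '^#' /tmp/workspace_check.bzl   # all 3 lines are now commented
```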

$ sudo nano /opt/intel/computer_vision_sdk/deployment_tools/model_optimizer/tf_call_ie_layer/build.sh
### Replace the bazel build line as shown (comment out the original):

#bazel build --config=monolithic //tensorflow/cc/inference_engine_layer:libtensorflow_call_layer.so
bazel build --incompatible_load_argument_is_label=false //tensorflow/cc/inference_engine_layer:libtensorflow_call_layer.so
### Although an error always occurs, ignore it and proceed to the next step.
$ sudo -E /opt/intel/computer_vision_sdk/deployment_tools/model_optimizer/tf_call_ie_layer/build.sh
$ sudo nano /home/xxxx/.cache/bazel/_bazel_root/xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx/external/io_bazel_rules_closure/closure/repositories.bzl
### Comment out this version check as well:

#    if minimum_bazel_version > current_bazel_version:
#      fail("%s requires Bazel >=%s but was %s" % (
#          project, bazel_version, native.bazel_version))
$ sudo -E /opt/intel/computer_vision_sdk/deployment_tools/model_optimizer/tf_call_ie_layer/build.sh

tensorflow v1.5.1 + bazel 0.17.2 --- the built ".so" is broken

$ mkdir bazel;cd bazel
$ wget https://github.com/bazelbuild/bazel/releases/download/0.17.2/bazel-0.17.2-dist.zip
$ unzip bazel-0.17.2-dist.zip;rm bazel-0.17.2-dist.zip
$ sudo bash ./compile.sh
$ sudo cp output/bazel /usr/local/bin


$ sudo nano /opt/intel/computer_vision_sdk/deployment_tools/model_optimizer/tf_call_ie_layer/build.sh
### Drop the --config=monolithic option from the bazel build line:

#bazel build --config=monolithic //tensorflow/cc/inference_engine_layer:libtensorflow_call_layer.so
bazel build //tensorflow/cc/inference_engine_layer:libtensorflow_call_layer.so

See (https://github.com/tensorflow/tensorflow/issues/18450)

tensorflow v1.6.0 + bazel 0.17.2 --- see https://github.com/tensorflow/tensorflow/issues/18450; the built ".so" is broken

$ mkdir bazel;cd bazel
$ wget https://github.com/bazelbuild/bazel/releases/download/0.17.2/bazel-0.17.2-dist.zip
$ unzip bazel-0.17.2-dist.zip;rm bazel-0.17.2-dist.zip
$ sudo bash ./compile.sh
$ sudo cp output/bazel /usr/local/bin


tensorflow v1.7.1 + bazel 0.16.1 --- the built ".so" is broken

Retried on 2019.01.14

$ cd ~
$ mkdir bazel;cd bazel
$ wget https://github.com/bazelbuild/bazel/releases/download/0.16.1/bazel-0.16.1-dist.zip
$ unzip bazel-0.16.1-dist.zip
$ rm bazel-0.16.1-dist.zip
$ sudo bash ./compile.sh
$ sudo cp output/bazel /usr/local/bin
$ sudo rm -rf ~/.cache/bazel/_bazel_<username>
$ rm -rf ~/.bazel ~/.bazelrc
$ cd ~
$ mkdir git;cd git
$ git clone -b v1.7.1 https://github.com/tensorflow/tensorflow.git
$ cd tensorflow
$ git checkout -b v1.7.1
$ ./configure
$ sudo bazel build --config opt //tensorflow/tools/pip_package:build_pip_package
$ ./bazel-bin/tensorflow/tools/pip_package/build_pip_package /tmp/tensorflow_pkg
$ sudo pip3 uninstall tensorflow-gpu tensorflow
$ sudo -H pip3 install /tmp/tensorflow_pkg/tensorflow-1.7.1-cp35-cp35m-linux_x86_64.whl
$ export TF_ROOT_DIR=/home/<username>/git/tensorflow
$ cd ~/git/tensorflow
$ sudo -E /opt/intel/computer_vision_sdk/deployment_tools/model_optimizer/tf_call_ie_layer/build.sh
### Output path of ".so" file
/home/<username>/git/tensorflow/bazel-bin/tensorflow/cc/inference_engine_layer/libtensorflow_call_layer.so
$ ldd -r /home/<username>/git/tensorflow/bazel-bin/tensorflow/cc/inference_engine_layer/libtensorflow_call_layer.so
$ nm -CDu libtensorflow_call_layer.so | grep TensorDesc

tensorflow v1.9.0 + bazel 0.17.2 --- the built ".so" is broken

tensorflow v1.11.0 + bazel 0.17.2 --- the built ".so" is broken

tensorflow v1.12.0 + bazel 0.17.2 --- the built ".so" is broken

$ sudo apt-get install python3-dev python3-pip libc-ares-dev
$ sudo -H pip3 install -U --user pip six numpy wheel mock
$ sudo -H pip3 install -U --user keras_applications==1.0.6 --no-deps
$ sudo -H pip3 install -U --user keras_preprocessing==1.0.5 --no-deps
$ cd ~
$ mkdir work;cd work
$ git clone -b v1.12.0 https://github.com/tensorflow/tensorflow.git
$ cd tensorflow
$ git checkout -b v1.12.0
$ ./configure

Please specify the location of python. [Default is /usr/bin/python]: /usr/bin/python3


Found possible Python library paths:
  /home/xxxx/git/tensorflow/models/research/object_detection
  /opt/intel//computer_vision_sdk_2018.5.445/python/python3.5
  /usr/local/lib
  /home/xxxx/git/caffe-jacinto/python
  .
  /opt/movidius/caffe/python
  /home/xxxx/git/tensorflow/models/research
  /opt/intel//computer_vision_sdk_2018.5.445/deployment_tools/model_optimizer
  /opt/intel//computer_vision_sdk_2018.5.445/python/python3.5/ubuntu16
  /usr/lib/python3/dist-packages
  /usr/local/lib/python3.5/dist-packages
Please input the desired Python library path to use.
Default is [/home/xxxx/git/tensorflow/models/research/object_detection] /usr/local/lib/python3.5/dist-packages
Do you wish to build TensorFlow with Apache Ignite support? [Y/n]: 
Apache Ignite support will be enabled for TensorFlow.

Do you wish to build TensorFlow with XLA JIT support? [Y/n]: 
XLA JIT support will be enabled for TensorFlow.

Do you wish to build TensorFlow with OpenCL SYCL support? [y/N]: 
No OpenCL SYCL support will be enabled for TensorFlow.

Do you wish to build TensorFlow with ROCm support? [y/N]: 
No ROCm support will be enabled for TensorFlow.

Do you wish to build TensorFlow with CUDA support? [y/N]: 
No CUDA support will be enabled for TensorFlow.

Do you wish to download a fresh release of clang? (Experimental) [y/N]: 
Clang will not be downloaded.

Do you wish to build TensorFlow with MPI support? [y/N]: 
No MPI support will be enabled for TensorFlow.

Please specify optimization flags to use during compilation when bazel option "--config=opt" is specified [Default is -march=native]: 


Would you like to interactively configure ./WORKSPACE for Android builds? [y/N]: 
Not configuring the WORKSPACE for Android builds.

Preconfigured Bazel build configs. You can use any of the below by adding "--config=<>" to your build command. See tools/bazel.rc for more details.
	--config=mkl         	# Build with MKL support.
	--config=monolithic  	# Config for mostly static monolithic build.
	--config=gdr         	# Build with GDR support.
	--config=verbs       	# Build with libverbs support.
	--config=ngraph      	# Build with Intel nGraph support.
Configuration finished

$ sudo /var/lib/dpkg/info/ca-certificates-java.postinst configure
$ sudo rm -rf /home/<username>/.cache/bazel/_bazel_<username>

$ bazel build \
--incompatible_remove_native_http_archive=false \
--incompatible_package_name_is_a_function=false \
--config=opt \
--define=grpc_no_ares=true \
//tensorflow/tools/pip_package:build_pip_package

$ ./bazel-bin/tensorflow/tools/pip_package/build_pip_package /tmp/tensorflow_pkg

$ sudo pip3 uninstall tensorflow-gpu tensorflow
$ sudo -H pip3 install /tmp/tensorflow_pkg/tensorflow-1.12.0-cp35-cp35m-linux_x86_64.whl
$ export TF_ROOT_DIR=/home/<username>/work/tensorflow
$ nano ~/.bashrc
source /opt/intel/computer_vision_sdk/bin/setupvars.sh #<--- Add

$ source ~/.bashrc

$ sudo nano /opt/intel/computer_vision_sdk/deployment_tools/model_optimizer/tf_call_ie_layer/build.sh
### Replace the bazel build line as shown (comment out the original):

#bazel build --config=monolithic //tensorflow/cc/inference_engine_layer:libtensorflow_call_layer.so
bazel build --incompatible_remove_native_http_archive=false //tensorflow/cc/inference_engine_layer:libtensorflow_call_layer.so

# bazel 0.17.2
$ sudo -E /opt/intel/computer_vision_sdk/deployment_tools/model_optimizer/tf_call_ie_layer/build.sh

Dynamic link status (the broken ".so")

xxxx@ubuntu:~/git/tiny-yolo-tensorflow$ ldd -r libtensorflow_call_layer.so
	linux-vdso.so.1 =>  (0x00007fff2c9b5000)
	libdl.so.2 => /lib/x86_64-linux-gnu/libdl.so.2 (0x00007ff5b4f7a000)
	libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007ff5b4d5d000)
	libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007ff5b4a54000)
	libstdc++.so.6 => /usr/lib/x86_64-linux-gnu/libstdc++.so.6 (0x00007ff5b46d2000)
	libgcc_s.so.1 => /lib/x86_64-linux-gnu/libgcc_s.so.1 (0x00007ff5b44bc000)
	libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007ff5b40f2000)
	/lib64/ld-linux-x86-64.so.2 (0x00007ff5ba83c000)
undefined symbol: _ZN15InferenceEngine10TensorDescC1Ev	(./libtensorflow_call_layer.so)
undefined symbol: _ZNK15InferenceEngine4Data13getTensorDescEv	(./libtensorflow_call_layer.so)
undefined symbol: _ZN15InferenceEngine12BlockingDescC1ERKSt6vectorImSaImEES5_	(./libtensorflow_call_layer.so)
undefined symbol: _ZN15InferenceEngine10TensorDescC1ERKNS_9PrecisionESt6vectorImSaImEERKNS_12BlockingDescE	(./libtensorflow_call_layer.so)
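The unresolved names above are mangled C++ symbols from the Inference Engine (TensorDesc constructors, Data::getTensorDesc, BlockingDesc), which suggests the library was built without linking against libinference_engine.so. They can be made readable with c++filt from binutils:

```shell
# Demangle two of the unresolved symbols reported by ldd -r above.
c++filt _ZN15InferenceEngine10TensorDescC1Ev
# -> InferenceEngine::TensorDesc::TensorDesc()
c++filt _ZNK15InferenceEngine4Data13getTensorDescEv
# -> InferenceEngine::Data::getTensorDesc() const
```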

Reference