Using Bazel's custom APT repository (recommended)
$ sudo apt-get install openjdk-8-jdk
$ echo "deb [arch=amd64] http://storage.googleapis.com/bazel-apt stable jdk1.8" | sudo tee /etc/apt/sources.list.d/bazel.list
$ curl https://bazel.build/bazel-release.pub.gpg | sudo apt-key add -
If you want to install the testing version of Bazel, replace "stable" with "testing".
$ sudo apt-get update && sudo apt-get install bazel
Once installed, you can upgrade to a newer version of Bazel with:
$ sudo apt-get upgrade bazel
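To confirm that the install (or an upgrade) took effect, check the reported version:
$ bazel version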
$ sudo pip install grpcio  # gRPC Python package, required by the Serving API
$ sudo apt-get update && sudo apt-get install -y \
build-essential \
curl \
libcurl3-dev \
git \
libfreetype6-dev \
libpng12-dev \
libzmq3-dev \
pkg-config \
python-dev \
python-numpy \
python-pip \
software-properties-common \
swig \
zip \
zlib1g-dev
$ pip install tensorflow-serving-api
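As a quick sanity check that both packages are importable (predict_pb2 is one of the modules the tensorflow-serving-api package provides):
$ python -c "import grpc; from tensorflow_serving.apis import predict_pb2"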
$ git clone --recurse-submodules https://github.com/tensorflow/serving
$ cd serving
--recurse-submodules is required to fetch TensorFlow, gRPC, and other libraries that TensorFlow Serving depends on.
Note that these instructions will install the latest master branch of TensorFlow Serving. If you want to install a specific branch (such as a release branch), pass -b <branch> to the git clone command.
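For example, to clone a specific branch in one step (<branch> is a placeholder; pick a name from the repository's branch list):
$ git clone -b <branch> --recurse-submodules https://github.com/tensorflow/serving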
Follow the Prerequisites section above to install all dependencies. To configure TensorFlow, run:
$ cd tensorflow
$ ./configure
$ cd ..
$ bazel build -c opt tensorflow_serving/...
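If the build exhausts memory on a constrained machine, you can cap Bazel's resource usage; this is a sketch assuming a Bazel version that accepts the three-value form of --local_resources (RAM in MB, CPU cores, available I/O):
$ bazel build -c opt --local_resources 2048,.5,1.0 tensorflow_serving/...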
Binaries are placed in the bazel-bin directory, and can be run using a command like:
$ bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server
To test your installation, execute:
$ sudo pip install autograd  # fixes "ImportError: No module named autograd" during the tests
$ bazel test -c opt tensorflow_serving/...
$ rm -rf /tmp/mnist_model  # clear any previous export
$ bazel build -c opt //tensorflow_serving/example:mnist_saved_model
$ bazel-bin/tensorflow_serving/example/mnist_saved_model /tmp/mnist_model
Training model...
...
Done training!
Exporting trained model to /tmp/mnist_model
Done exporting!
Or, run the export script directly:
$ python tensorflow_serving/example/mnist_saved_model.py /tmp/mnist_model
$ ls /tmp/mnist_model
1
$ ls /tmp/mnist_model/1
saved_model.pb variables
Each version sub-directory contains the following files:
saved_model.pb - the serialized tensorflow::SavedModel, containing one or more graph definitions of the model plus metadata such as signatures.
variables - files that hold the serialized variables of the graphs.
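To double-check that an exported version loads cleanly outside the server, you can read it back with the TF 1.x SavedModel loader (a minimal sketch; the SERVING tag below is the standard tag the example exporter uses):

import tensorflow as tf

with tf.Session(graph=tf.Graph()) as sess:
    # Load version 1 of the export; tag_constants.SERVING matches the tag
    # mnist_saved_model.py attaches to its graph.
    tf.saved_model.loader.load(
        sess, [tf.saved_model.tag_constants.SERVING], '/tmp/mnist_model/1')
    print('SavedModel loaded OK')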
$ bazel build -c opt //tensorflow_serving/model_servers:tensorflow_model_server  # optional: already built by the tensorflow_serving/... build above
$ bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server --port=9000 --model_name=mnist --model_base_path=/tmp/mnist_model/
$ bazel build -c opt //tensorflow_serving/example:mnist_client
$ bazel-bin/tensorflow_serving/example/mnist_client --num_tests=1000 --server=localhost:9000
...
Inference error rate: 10.4%
Or, run the client script directly:
$ python tensorflow_serving/example/mnist_client.py --num_tests=1000 --server=localhost:9000
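For reference, the heart of mnist_client is a PredictRequest sent over the Predict RPC. A minimal sketch, assuming a tensorflow-serving-api version that ships prediction_service_pb2_grpc (older releases used the grpc.beta API instead); the names 'predict_images', 'images', and 'scores' follow the mnist_saved_model example's signature and should be verified against the scripts:

import grpc
import numpy
import tensorflow as tf
from tensorflow_serving.apis import predict_pb2
from tensorflow_serving.apis import prediction_service_pb2_grpc

channel = grpc.insecure_channel('localhost:9000')
stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)

request = predict_pb2.PredictRequest()
request.model_spec.name = 'mnist'                  # matches --model_name above
request.model_spec.signature_name = 'predict_images'

# A dummy all-zeros image standing in for a flattened 28x28 MNIST digit.
image = numpy.zeros(784, dtype=numpy.float32)
request.inputs['images'].CopyFrom(
    tf.contrib.util.make_tensor_proto(image, shape=[1, 784]))

result = stub.Predict(request, 10.0)  # 10-second timeout
print(result.outputs['scores'])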