Saving a Model's Signature Information & Serving

TensorFlow Serving started supporting a RESTful API in version 1.8, in addition to the existing gRPC API.

1. Obtain the model file (SavedModel pb). For example:

mkdir -p /tmp/resnet
curl -s https://storage.googleapis.com/download.tensorflow.org/models/official/20181001_resnet/savedmodels/resnet_v2_fp32_savedmodel_NHWC_jpg.tar.gz | tar --strip-components=2 -C /tmp/resnet -xvz

2. Inspect the pb file

$ docker run -it -v /tmp:/tmp tensorflow/tensorflow bash

# in container

root@e8d3550addfa:/tmp# saved_model_cli show --dir /tmp/resnet/1538687457/ --all
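If the full --all dump is too noisy, saved_model_cli can also print a single signature. A small sketch; the version directory 1538687457 comes from the archive downloaded in step 1, and the saved_model_cli call itself needs TensorFlow installed (e.g. inside the container), so it is left commented out:

```shell
# Version directory unpacked by the step-1 download.
MODEL_DIR=/tmp/resnet/1538687457
echo "$MODEL_DIR"
# Show just the default serving signature instead of everything:
# saved_model_cli show --dir "$MODEL_DIR" --tag_set serve --signature_def serving_default
```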

3. Start TF Serving with Docker

docker run -p 8500:8500 -p 8501:8501 --name tfserving_resnet \
  --mount type=bind,source=/tmp/resnet,target=/models/resnet \
  -e MODEL_NAME=resnet -t tensorflow/serving &

4. Check the model metadata from the host with curl

curl localhost:8501/v1/models/resnet/metadata
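Besides /metadata, TF Serving also exposes a model status endpoint at /v1/models/&lt;name&gt;. A small sketch of how both URLs are composed from the MODEL_NAME used in step 3; the curl calls are commented out since they need the running serving container:

```shell
# Model name must match the MODEL_NAME passed to the serving container.
MODEL=resnet
STATUS_URL="localhost:8501/v1/models/${MODEL}"
METADATA_URL="${STATUS_URL}/metadata"
echo "$METADATA_URL"
# Model version status -- state becomes AVAILABLE once the model has loaded:
# curl "$STATUS_URL"
# SignatureDef metadata, as in the curl command above:
# curl "$METADATA_URL"
```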

5. Use the REST API

Classify and Regress API

Predict API

When there is only one named input, the value of the "instances" key can be the input value itself:

{
  // List of 3 scalar tensors.
  "instances": [ "foo", "bar", "baz" ]
}

"foo" can also be { "b64": "aW1hZ2UgYnl0ZXM=" } // base64-encoded binary data
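Putting the pieces together, a predict call against the server from step 3 can be sketched as follows. Here 'image bytes' is only a stand-in payload (a real request would base64-encode an actual JPEG file), and the curl call is commented out since it needs the running container:

```shell
# Stand-in for real JPEG bytes; encodes to the doc's example aW1hZ2UgYnl0ZXM=.
B64=$(printf 'image bytes' | base64)
# Wrap the base64 string in a {"b64": ...} object so TF Serving decodes it
# back to raw bytes on the server side.
BODY="{\"instances\": [{\"b64\": \"$B64\"}]}"
echo "$BODY"
# Send the predict request (requires the serving container from step 3):
# curl -d "$BODY" localhost:8501/v1/models/resnet:predict
```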

www.tensorflow.org/serving/api_rest

6. Alternatively, deploy the model with Kubeflow's tf-serving component.
