A few days ago I wrote about how to retrieve the signature of an exported model in TensorFlow; today I want to continue with how to export a model for serving, specifically exporting a model and serving it with TFServing. TFServing is a high-performance TensorFlow serving service written in C++. I am working on building a serving infrastructure, so I have spent a lot of time exporting TensorFlow models and making them servable via TFServing.
The requirement for an exported model to be servable by TFServing is quite simple: you need to define named signatures called inputs and outputs. The inputs signature defines the shape of the graph's input tensors, and the outputs signature defines the output tensors of the prediction.
Exporting from a TensorFlow graph

This is straightforward. If you build the graph yourself, you already have the input and output tensors. You just need to create a Saver and an Exporter, then call export with the right arguments.
import tensorflow as tf
from tensorflow.contrib.session_bundle import exporter

# Sharded saver so large variables can be split across checkpoint files.
saver = tf.train.Saver(sharded=True)
model_exporter = exporter.Exporter(saver)
model_exporter.init(
    sess.graph.as_graph_def(),
    named_graph_signatures={
        # x and y are the input and output tensors of your graph.
        'inputs': exporter.generic_signature({'images': x}),
        'outputs': exporter.generic_signature({'scores': y})})
model_version_path = model_exporter.export(
    '/tmp/mnist_exported_model', tf.constant(FLAGS.export_version), sess)
Please see here for a complete example.
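If you want to sanity-check the export, one option (a minimal sketch of my own, assuming the exporter writes the version into a zero-padded subdirectory such as 00000001) is to load the bundle back with tf.contrib.session_bundle and confirm the graph restores:

from tensorflow.contrib.session_bundle import session_bundle

# Load the exported bundle back into a fresh session. The version
# subdirectory name (00000001 here) is an assumption; check what the
# exporter actually wrote under /tmp/mnist_exported_model.
sess, meta_graph_def = session_bundle.load_session_bundle_from_path(
    '/tmp/mnist_exported_model/00000001')
# The named signatures are stored in the meta graph's collections.
print(list(meta_graph_def.collection_def.keys()))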
Exporting from a tf.contrib.learn Estimator

This is trickier. Even though the estimator provides an export() API, the documentation is not helpful, and by default it won't export a named signature, so you cannot use it directly. Instead, you need to:
- Define an input_fn that returns the shape of the input. You can reuse the input_fn you already used for data feeding during training.
- Define a signature_fn as shown below.
- Pass input_feature_key and use_deprecated_input_fn=False when you call the export function.
Below is an example of exporting the classifier from this tutorial. Note: this is only for TensorFlow 0.11; for 0.12 and 1.0 the API may be different.
import tensorflow as tf
from tensorflow.contrib.learn.python.learn.utils import export
from tensorflow.contrib.session_bundle import exporter

def my_input_fn():
    # Define the shape of the input tensors; it needs to match the
    # shape of the features you feed into the estimator. Labels are
    # not needed for exporting, so return None in their place.
    return {
        "": tf.placeholder(tf.float32, shape=[None, 4])
    }, None

def my_signature_fn(examples, features, predictions):
    # Return (default_signature, named_graph_signatures); only the
    # named signatures matter here.
    return None, {
        "inputs": exporter.generic_signature({"features": examples}),
        "outputs": exporter.generic_signature({"score": predictions})
    }

model_version_path = classifier.export(
    "/tmp/iris_exported_model",
    input_fn=my_input_fn,
    input_feature_key="",
    use_deprecated_input_fn=False,
    signature_fn=my_signature_fn
)
Some explanation: in the input_fn you define the features of your estimator; it returns a dict of tensors that represents your data. Usually an input_fn returns a tuple of a features dict and a labels tensor, but for exporting you can skip the labels. You can refer to here for detailed documentation. The above input_fn returns a feature tensor whose feature name is the empty string (""), which is why we also need to pass input_feature_key="" to the export function.
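If the empty feature key feels awkward, here is a hedged variant of my own (the key name flower_features is made up, not from the tutorial) that gives the feature a real name; input_feature_key must then match it:

import tensorflow as tf

def named_input_fn():
    # Hypothetical feature key "flower_features"; it must match the key
    # under which you fed features to the estimator during training.
    return {
        "flower_features": tf.placeholder(tf.float32, shape=[None, 4])
    }, None

# Reuses my_signature_fn from the example above.
model_version_path = classifier.export(
    "/tmp/iris_exported_model",
    input_fn=named_input_fn,
    input_feature_key="flower_features",
    use_deprecated_input_fn=False,
    signature_fn=my_signature_fn
)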
Once the model is exported, you can just ship it to TFServing and start serving it. I will continue this series in the next few days with how to run the serving service and send requests to it.