Loading TensorFlow models from Amazon S3 with TensorFlow Serving
In this article, I am going to show you how to store a TensorFlow model in a file, upload it to Amazon S3, and configure the TensorFlow Serving Docker image to serve that model via a REST API. Before we start, we have to save the TensorFlow model to a file using the simple_save function. I am going to assume that you have already trained your model. We need to specify an output directory and make sure that its parent location exists; when the target directory is ready, we can call simple_save.
Sep-24-2019, 06:07:32 GMT