Hosting Models with TF Serving on Docker
Training a Machine Learning (ML) model is only one step in the ML lifecycle. A trained model delivers no value until you can get predictions from it, so you must be able to host it for inference. There is a variety of hosting/deployment options for ML, and one of the most popular is TensorFlow Serving. TensorFlow Serving takes your trained model's artifacts and hosts them for inference.
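As a minimal sketch of what this looks like in practice, the commands below pull the official `tensorflow/serving` image and serve a SavedModel over REST. The model name `my_model` and the host path `/path/to/my_model` are placeholders; substitute your own exported SavedModel directory (which should contain a numeric version subdirectory, e.g. `1/`).

```shell
# Pull the official TensorFlow Serving image
docker pull tensorflow/serving

# Serve a SavedModel; 8501 is TF Serving's default REST API port.
# The host directory /path/to/my_model is a placeholder for your
# exported SavedModel (containing a version subfolder such as 1/).
docker run -p 8501:8501 \
  --mount type=bind,source=/path/to/my_model,target=/models/my_model \
  -e MODEL_NAME=my_model \
  -t tensorflow/serving

# Check that the model is up and serving
curl http://localhost:8501/v1/models/my_model
```

Once the container reports the model as `AVAILABLE`, predictions can be requested by POSTing JSON to `http://localhost:8501/v1/models/my_model:predict`.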
Jun-10-2022, 08:26:53 GMT