Deploy Model to KServe
Abstract
In this tutorial, we take a model container built with the Chassis.ml service and walk through the steps required to deploy it to KServe. KServe is an open-source model inference platform that runs on top of Kubernetes.
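As a preview of where the tutorial ends up, the sketch below shows one way to register a Chassis-built container with KServe as a custom-predictor InferenceService, using the official Kubernetes Python client. It is a minimal illustration, not the tutorial's exact steps: the image name, service name, and namespace are placeholders, and it assumes KServe is already installed in the cluster and that a kubeconfig with sufficient permissions is available.

```python
# Hypothetical sketch: create a KServe InferenceService that serves a
# Chassis-built container image. All names below are placeholders.
from kubernetes import client, config

config.load_kube_config()  # assumes a local kubeconfig pointing at the cluster

inference_service = {
    "apiVersion": "serving.kserve.io/v1beta1",
    "kind": "InferenceService",
    "metadata": {"name": "chassis-model", "namespace": "default"},
    "spec": {
        "predictor": {
            # Custom predictor: point KServe at the container image Chassis built.
            "containers": [
                {
                    "name": "kserve-container",
                    "image": "example-registry/chassis-model:latest",  # placeholder image
                }
            ]
        }
    },
}

# InferenceService is a custom resource, so it is created through the
# CustomObjectsApi rather than a typed client.
api = client.CustomObjectsApi()
api.create_namespaced_custom_object(
    group="serving.kserve.io",
    version="v1beta1",
    namespace="default",
    plural="inferenceservices",
    body=inference_service,
)
```

The same resource can equally be written as YAML and applied with kubectl; the Python form is shown here only to keep the example self-contained.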
Kubeflow Tutorial | Model Serving
In this video, I walk you through a simple model engineering process using Kubeflow Fairing. (Note: there is now a better way to serve models: KFServing, now known as KServe.)