
# KFServing Examples

## Deploy KFServing InferenceService with an out-of-the-box Predictor

* SKLearn Model
* PyTorch Model
* TensorFlow Model
* XGBoost Model
* ONNX Model with ONNX Runtime
* TensorRT Model with NVIDIA's TensorRT Inference Server
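For each of these model servers, an out-of-the-box predictor needs only a short manifest pointing at a stored model. A minimal sketch for the SKLearn case, assuming the `serving.kubeflow.org/v1alpha2` API group of a KFServing-era install (the service name and `storageUri` below are illustrative placeholders):

```yaml
apiVersion: serving.kubeflow.org/v1alpha2
kind: InferenceService
metadata:
  name: sklearn-iris            # illustrative name
spec:
  default:
    predictor:
      sklearn:
        # illustrative bucket path; point this at your own trained model
        storageUri: "gs://my-bucket/models/sklearn/iris"
```

Applying the manifest with `kubectl apply -f` creates the predictor; the other out-of-the-box servers follow the same shape with `pytorch`, `tensorflow`, `xgboost`, etc. in place of `sklearn`.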

## Deploy KFServing InferenceService with a custom Predictor

* Hello World Flask Server
* KFServing Custom Model
* Prebuilt Image

## Deploy KFServing InferenceService with a Transformer

* Image Transformer with PyTorch Predictor

## Deploy KFServing InferenceService with an Explainer

* Alibi Image Explainer
* Alibi Text Explainer
* Alibi Tabular Explainer

## Deploy KFServing InferenceService with Cloud or PVC Storage

* Models on S3
* Models on PVC
* Models on Azure
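The storage examples differ mainly in the scheme of the `storageUri` field. A sketch of the common forms (bucket, claim, and account names are placeholders):

```yaml
# S3-compatible object storage; credentials typically come from a
# Kubernetes Secret attached to the service account
storageUri: "s3://my-bucket/models/sklearn/iris"

# A PersistentVolumeClaim that already exists in the cluster
storageUri: "pvc://my-model-pvc/models/sklearn/iris"

# Azure Blob Storage
storageUri: "https://myaccount.blob.core.windows.net/models/sklearn/iris"
```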

## Deploy KFServing InferenceService with Autoscaling, Canary Rollout, and Other Integrations

* Autoscale inference workload on CPU/GPU
* InferenceService on GPU nodes
* Canary Rollout
* InferenceService with Kubeflow Pipeline
* InferenceService with Request/Response Logger
* InferenceService with Kafka Event Source
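A canary rollout splits traffic between the `default` spec and a `canary` spec in the same InferenceService. A minimal sketch under the v1alpha2-style API (names, paths, and the traffic percentage are illustrative):

```yaml
apiVersion: serving.kubeflow.org/v1alpha2
kind: InferenceService
metadata:
  name: sklearn-iris
spec:
  default:
    predictor:
      sklearn:
        storageUri: "gs://my-bucket/models/iris/v1"   # current model
  canary:
    predictor:
      sklearn:
        storageUri: "gs://my-bucket/models/iris/v2"   # candidate model
  canaryTrafficPercent: 10   # send 10% of requests to the canary
```

Raising `canaryTrafficPercent` gradually promotes the candidate; removing the `canary` block rolls all traffic back to `default`.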