This repository has been archived by the owner on Feb 17, 2025. It is now read-only.

Commit

chore: fix docs
lampajr committed Jun 7, 2024
1 parent 3d8b301 commit b106572
Showing 2 changed files with 25 additions and 6 deletions.
5 changes: 3 additions & 2 deletions GET_STARTED.md
````diff
@@ -11,7 +11,8 @@ OCI protocol KServe custom storage initializer.
 
 ### Build local oci-storage-initializer
 
-> **NOTE**: You can skip this step if you want to use an existing container image
+> [!NOTE]
+> You can skip this step if you want to use an existing container image
 ```bash
 export VERSION=<replace>
@@ -49,7 +50,7 @@ We assume all [prerequisites](#prerequisites) are satisfied at this point.
 ```bash
 kind load docker-image quay.io/${USER}/oci-storage-initializer:${VERSION}
 ```
-> **NOTE**: Skip this step if you are using a publicly available image
+> Skip this step if you are using a publicly available image
 
 ## First InferenceService with OCI artifact URI
````
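After the image is built and loaded, KServe needs to know that `oci-artifact://` URIs should be handled by this image. In KServe this mapping is declared with a `ClusterStorageContainer` resource; the manifest below is a sketch assuming the image name, tag variables, and `oci-artifact://` prefix used elsewhere in these docs (the repository's own manifest may differ in resource limits and naming):

```yaml
apiVersion: "serving.kserve.io/v1alpha1"
kind: ClusterStorageContainer
metadata:
  name: oci-storage-initializer
spec:
  # Container that KServe injects as the storage-initializer init container
  container:
    name: storage-initializer
    image: quay.io/${USER}/oci-storage-initializer:${VERSION}
    resources:
      requests:
        memory: 100Mi
        cpu: 100m
      limits:
        memory: 1Gi
  # URI schemes this initializer claims
  supportedUriFormats:
    - prefix: oci-artifact://
```

Applying it with `kubectl apply -f` registers the initializer cluster-wide, so any `InferenceService` whose `storageUri` starts with `oci-artifact://` is resolved by this container.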
26 changes: 22 additions & 4 deletions README.md
````diff
@@ -1,9 +1,26 @@
 # OCI KServe Custom Storage Initializer
 
-This repository provides an example of [KServer custom storage initializer](https://kserve.github.io/website/latest/modelserving/storage/storagecontainers/)
-that showcases how users can automate a machine learning model deployment stored as OCI artifact using KServe.
+Have you ever wanted to deploy an ML model by directly referencing an OCI artifact that contains it?
 
-> **NOTE**: This repository was mainly intended for demoing purposes
+This repository provides an example of a [KServe custom storage initializer](https://kserve.github.io/website/latest/modelserving/storage/storagecontainers/)
+that showcases how users can automate, using KServe, the deployment of an ML model stored as an OCI artifact.
 
+Users can then create an `InferenceService` like this one, and the job is done :smile:
+```yaml
+apiVersion: "serving.kserve.io/v1beta1"
+kind: "InferenceService"
+metadata:
+  name: "sklearn-iris"
+spec:
+  predictor:
+    model:
+      modelFormat:
+        name: sklearn
+      storageUri: "oci-artifact://quay.io/<path-to-your-greatest-model>"
+```
+> [!NOTE]
+> This repository was mainly intended for demo purposes.
 The implementation was inspired by the default [KServe storage container](https://github.com/kserve/kserve/blob/1c51eeee174330b076e4171e6d71e9138f2510b3/python/kserve/kserve/storage/storage.py),
 this means that its integration should be pretty straightforward if required at some point.
@@ -21,7 +38,8 @@ Prerequisites:
 * Python
 * Poetry
-> **NOTE**: I would suggest to use a Python virtual environment as development environment
+> [!NOTE]
+> I would suggest using a Python virtual environment as the development environment
 Install the dependencies:
 ```bash
````
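The new README snippet stops at creating the `InferenceService`. A plausible follow-up, assuming the sample manifest above is saved as `inference-service.yaml` and the storage initializer is already registered in the cluster (names here are illustrative, not from the commit):

```bash
# Create the InferenceService that pulls the model from the OCI artifact
kubectl apply -f inference-service.yaml

# Block until the predictor reports Ready, then show its status and URL
kubectl wait --for=condition=Ready inferenceservice/sklearn-iris --timeout=300s
kubectl get inferenceservice sklearn-iris
```

While the pod starts, the injected `storage-initializer` init container downloads and unpacks the OCI artifact into the model directory before the predictor container runs.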
