Add the quickstart #65

Open · wants to merge 3 commits into base: main
Changes from 1 commit
Incorporate reviews
Rishit-dagli committed Dec 23, 2023
commit 39da83ce676f42764213c602d19ebde48816492d
12 changes: 8 additions & 4 deletions content/docs/machine-learning/quickstart.md
@@ -8,7 +8,7 @@ description: Learn how to get started with Civo's Kubeflow via an example.
<title>Kubeflow quickstart | Civo Documentation</title>
</head>

In this quickstart mainly focused towards Pipelines and Notebooks, we take a look at a simple example to train and run a model using Kubeflow Pipelines and Notebooks.
In this quickstart, focused mainly on Pipelines and Notebooks, we take a look at an example of training and running a machine learning model using Kubeflow.

## Before you start

@@ -18,7 +18,7 @@ Make sure you can create a notebook and log into the notebook by following the i

## Run a pipeline

We will first compile a pipeline DSL that will train a model using the MNIST dataset. Notice that this pipeline mainly has the following coponents which are run: hyperparameter tuning with Katib, creating a new volume for training, run a training job and finally serve the model using KServe.
We will first compile a pipeline DSL that will train a model using the [MNIST dataset](https://ieeexplore.ieee.org/document/726791). Notice that this pipeline runs the following main components: hyperparameter tuning with Katib, creating a new volume for training, running a training job, and finally serving the model using KServe.

Now [create a new notebook instance](creating-a-new-kubeflow-notebook.md) and run the following commands:

@@ -31,7 +31,9 @@ python mnist-example.py

This produces a `mnist-example.yaml` file that we will use to run our pipeline.
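
For reference, the part of `mnist-example.py` that compiles the pipeline might look roughly like the sketch below. This is only a minimal illustration using the Kubeflow Pipelines SDK (`kfp`); the component, pipeline name, and parameter are placeholders, and the real example also wires in Katib, the training volume, and KServe.

```python
# Minimal sketch of compiling a pipeline to YAML with the Kubeflow Pipelines SDK.
# The component and parameter below are placeholders for illustration only.
from kfp import dsl, compiler

@dsl.component(base_image="python:3.11")
def train_mnist(epochs: int) -> str:
    # A real component would download MNIST and train a model here.
    return f"trained for {epochs} epochs"

@dsl.pipeline(name="mnist-example")
def mnist_pipeline(epochs: int = 1):
    train_mnist(epochs=epochs)

if __name__ == "__main__":
    # Writes the mnist-example.yaml used to create the pipeline in the UI.
    compiler.Compiler().compile(mnist_pipeline, "mnist-example.yaml")
```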

We now use [Pipelines](kubeflow-dashboard.md/) and use the pipeline configuration we generated in the previous step to define the pipeline. Once you do so, you should create a new [experiment](kubeflow-dashboard.md/) and then trigger a run for the pipline by going to the Run tab, clicking on the "create a run" button and choosing the pipeline you just created.
We now go to [Pipelines](kubeflow-dashboard.md/) and use the pipeline configuration we generated in the previous step to define the pipeline. Once you do so, create a new [experiment](kubeflow-dashboard.md/) and then trigger a run for the pipeline by going to the Run tab, clicking the "create a run" button, and choosing the pipeline you just created.

<img src={require('./images/mnist-pipeline.png').default} style={{"background-color":"white"}} />

This triggers a Kubeflow pipeline run, which we can follow in the Pipelines dashboard. Once the run is complete, we can see the model saved in the `end-to-end-pipeline-{ID}-model` volume we created earlier.
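
If you prefer to submit the run from the notebook instead of the dashboard, a rough equivalent with the KFP SDK client could look like the following sketch. The run and experiment names here are placeholders; inside a Kubeflow notebook, `kfp.Client()` can usually pick up in-cluster credentials automatically.

```python
# Alternative sketch: submit the compiled pipeline from a notebook cell with
# the KFP SDK client instead of the dashboard. Names below are placeholders.
import kfp

client = kfp.Client()  # in a Kubeflow notebook this typically authenticates in-cluster
run = client.create_run_from_pipeline_package(
    "mnist-example.yaml",
    arguments={},
    run_name="mnist-example-run",
    experiment_name="mnist-example",
)
print(run.run_id)
```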

@@ -66,4 +68,6 @@ response = requests.post(url, data=json_request)
print("Prediction for the image")
display(image)
print(response.json())
```

You should see the image and the JSON response our prediction endpoint returns for the image.
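
Since the model is served through KServe, the endpoint typically follows the KServe v1 inference protocol, so the JSON response is usually shaped like `{"predictions": [...]}`. As an illustration only (the exact output depends on how the model was exported), you could read off a predicted digit like this:

```python
# Illustrative only: assumes the response follows the KServe v1 protocol and the
# model returns one probability per digit class for the submitted image.
probs = response.json()["predictions"][0]
predicted_digit = probs.index(max(probs))
print(f"Predicted digit: {predicted_digit}")
```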