From 2d11d91524a676a87985ff5e3fd441ffea76de0f Mon Sep 17 00:00:00 2001
From: Michael Katsoulis
Date: Thu, 23 Nov 2023 15:03:55 +0000
Subject: [PATCH] Add info on the predict_data to the quickstart

---
 cerebrium/getting-started/quickstart.mdx | 7 +++++++
 1 file changed, 7 insertions(+)

diff --git a/cerebrium/getting-started/quickstart.mdx b/cerebrium/getting-started/quickstart.mdx
index 7eded3d2..821ad97e 100644
--- a/cerebrium/getting-started/quickstart.mdx
+++ b/cerebrium/getting-started/quickstart.mdx
@@ -48,6 +48,11 @@ You need to define a function with the name **predict** which receives 3 params:
 
 As long as your **main.py** contains the above you can write any other Python code. Import classes, add other functions etc.
+
+Take note of the parameters you've defined in your `Item` class.
+These are the parameters you will pass when you make an API call to your model endpoint. You can define as many parameters as you like and name them as you see fit. Just make sure to update the `predict_data` in your **config.yaml** so that you can test your model with some sample data. Otherwise, disable testing by setting `disable_predict` to `true`.
+
+
 ### Deploy model
 
 Then navigate to where your model code (specifically your `main.py`) is located and run the following command:
 
@@ -79,3 +84,5 @@ Below are some links outlining some of the more advanced functionality that Cort
 - [Persistent Storage](../data-sharing-storage/persistent-storage): Store model weights and files locally for faster access.
 - [Long Running Tasks](../deployments/long-running-tasks): Execute long running tasks in the background.
 - [Streaming](../endpoints/streaming): Stream output live back to your endpoint
+
+
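For context on the doc change above, here is a minimal sketch of how the `Item` parameters and `predict_data` fit together. This is an illustration, not the quickstart's actual code: the field name `prompt`, the echo logic, and the use of a plain `dataclass` (standing in for whatever model class the quickstart defines) are all assumptions.

```python
from dataclasses import dataclass

# Hypothetical stand-in for the quickstart's `Item` class.
# The field names you declare here are exactly the parameters
# callers send to your model endpoint.
@dataclass
class Item:
    prompt: str  # illustrative field, not from the quickstart

# The quickstart's predict signature takes 3 params: the request
# payload, a run identifier, and a logger.
def predict(item, run_id, logger):
    params = Item(**item)  # validate/unpack the incoming payload
    return {"result": f"echo: {params.prompt}"}

# The matching sample payload in config.yaml would supply the same
# fields, e.g. (illustrative):
#   predict_data: '{"prompt": "Hello world"}'

print(predict({"prompt": "Hello world"}, "run-1", None))
```

The point of the patch is that these two pieces must stay in sync: if `Item` gains or renames a field, the sample `predict_data` in **config.yaml** must change with it, or the post-deploy test call will fail.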