
Commit

Merge branch 'kats/cer-2624-toml-config-docs' of https://github.com/CerebriumAI/documentation into kats/cer-2624-toml-config-docs
Katsie011 committed Dec 7, 2023
2 parents 6baafcf + 5efa981 commit b445346
Showing 4 changed files with 2 additions and 4 deletions.
2 changes: 1 addition & 1 deletion cerebrium/environments/initial-setup.mdx
@@ -64,7 +64,7 @@ The parameters for your config file are the same as those which you would use as
## Config File Example
```toml
# This file was automatically generated by Cerebrium as a starting point for your project.
# You can edit it as you wish.
# If you would like to learn more about your Cerebrium config, please visit https://docs.cerebrium.ai/cerebrium/environments/initial-setup#config-file-example
# ...
```
1 change: 0 additions & 1 deletion examples/langchain.mdx
@@ -148,7 +148,6 @@ We then integrate Langchain with a Cerebrium-deployed endpoint to answer questions

Your cerebrium.toml file is where you can set your compute/environment. Please make sure that the hardware you specify is an AMPERE_A5000, and that you have enough memory (RAM) on your instance to run the models. Your cerebrium.toml file should look like:

```toml

[cerebrium.build]
# ...
```
1 change: 0 additions & 1 deletion examples/sdxl.mdx
@@ -104,7 +104,6 @@ def predict(item, run_id, logger):

Your cerebrium.toml file is where you can set your compute/environment. Please make sure that the hardware you specify is an AMPERE_A5000 and that you have enough memory (RAM) on your instance to run the models. Your cerebrium.toml file should look like:

```toml

[cerebrium.build]
# ...
```
2 changes: 1 addition & 1 deletion examples/transcribe-whisper.mdx
@@ -123,7 +123,6 @@ In our predict function, which only runs on inference requests, we simply create

Your cerebrium.toml file is where you can set your compute/environment. Please make sure that the hardware you specify is an AMPERE_A5000 and that you have enough memory (RAM) on your instance to run the models. Your cerebrium.toml file should look like:

```toml

[cerebrium.build]
@@ -157,6 +156,7 @@ openai-whisper
[cerebrium.requirements.conda]

```
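For context on the collapsed hunks above, here is a minimal sketch of what a full cerebrium.toml for this example might contain. Only the `[cerebrium.build]` and `[cerebrium.requirements.conda]` section headers, the `openai-whisper` requirement, and the AMPERE_A5000/RAM guidance are visible in the diff; every other section name, key, and value below is an assumption for illustration, not the repo's actual config.

```toml
# This file was automatically generated by Cerebrium as a starting point for your project.

[cerebrium.build]
# build options collapsed in the diff view

[cerebrium.hardware]        # hypothetical section name, not shown in the diff
gpu = "AMPERE_A5000"        # the GPU the surrounding docs call for
memory = 16.0               # enough RAM for the models; the value is a guess

[cerebrium.requirements.pip]  # hypothetical section name for the pip deps
openai-whisper = "latest"     # the one requirement visible in the hunk

[cerebrium.requirements.conda]
# empty in the diff
```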

To deploy the model, use the following command:

```bash
# ...
```
