[website] remove deployment button from blog posts
javierluraschi committed Jan 6, 2025
1 parent 769694c commit ccb472a
Showing 11 changed files with 22 additions and 21 deletions.
2 changes: 1 addition & 1 deletion website/blog/2025-06-01-Hal9_DSPy/DSPy.md
@@ -23,7 +23,7 @@ At the heart of DSPy are **signatures**, which specify what an LLM should do in

```python
'sentence -> sentiment: bool'
-'document -> summary
+'document -> summary'
```

This high-level abstraction allows you to focus on the desired behavior rather than the exact phrasing of the prompt. The power of signatures lies in their simplicity and adaptability, which make them an excellent tool for developers who want to interact with LLMs without needing to know the intricacies of prompt engineering. With DSPy, you can specify what you want—whether it’s summarizing a document or extracting sentiment—and let the framework handle the underlying complexity.
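
For context on how such a signature string is consumed, the sketch below shows one way it might be passed to `dspy.Predict`. This is a minimal, hypothetical example: it assumes the `dspy` package is installed and an OpenAI API key is configured, and the model name is only a placeholder.

```python
# Minimal sketch (assumptions): the `dspy` package is installed,
# OPENAI_API_KEY is set, and the model name is just a placeholder.
import dspy

# Point DSPy at a language model.
dspy.configure(lm=dspy.LM("openai/gpt-4o-mini"))

# The signature string declares inputs and outputs; DSPy builds the prompt.
classify = dspy.Predict("sentence -> sentiment: bool")

result = classify(sentence="DSPy makes prompting feel like programming.")
print(result.sentiment)  # a bool, per the signature's output type
```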
8 changes: 4 additions & 4 deletions website/learn/code/code-interpreter.md
@@ -12,7 +12,7 @@ This section will describe some of the possible use cases.

The first approach to using code from an LLM is to run it to compute complex answers. We can accomplish this as follows:

-```python
+```python deploy
import hal9 as h9
import os
from openai import OpenAI
@@ -35,7 +35,7 @@ LLMs are not really that good at doing math; however, with the help of this code

We can take this a step further and make use of external data sources to answer even more complex questions. Let's use a spreadsheet file in CSV format to extract the headers, ask the LLM to generate a query that answers the user question, and run the results:

-```python
+```python deploy
import hal9 as h9
import os
from openai import OpenAI
@@ -69,7 +69,7 @@ Taking this approach even further, instead of relying on `print` to communicate

We will ask the LLM to generate a web application:

-```python
+```python deploy
import openai
import os
import hal9 as h9
@@ -101,7 +101,7 @@ h9.save("app.html", code)

Instead of building web applications, you can build data analytics apps that query databases and display charts as follows:

-```python
+```python deploy
import hal9 as h9
from openai import OpenAI

6 changes: 3 additions & 3 deletions website/learn/code/create.md
@@ -10,7 +10,7 @@ From an application development perspective, the simplest chat interface we can

For example, the following Python code generates a chatbot that echoes the input back prefixed with "Echo:". Arguably this is the simplest chatbot we can create.

-```python
+```python deploy
echo = input()
print(f"Echo: {echo}")
```
@@ -36,7 +36,7 @@ The stateful method is easy to implement by storing data in-memory, but it will

To make your chatbot behave correctly even after it restarts, we can store the conversation messages to files. You can use any library to store and load files, but we recommend the `hal9` package convenience functions to `save` and `load` files with ease:

-```python
+```python deploy
from openai import OpenAI
import hal9 as h9

@@ -73,7 +73,7 @@ Hal9 encourages to attach files in a chat as URL links. For example, users can r

To help with processing links and attachments, Hal9 provides an `input` function which extracts the text contents of a URL, making it easy to manage uploads.

-```python
+```python deploy
import hal9 as h9

contents = h9.input('Enter a URL: ')
4 changes: 2 additions & 2 deletions website/learn/code/deployment.md
@@ -19,7 +19,7 @@ Beside chatbots, Hal9 can also deploy and run other content types like images, s

Web Applications (Web Apps) are applications that provide endpoints for us to use with a web browser (Chrome, Safari, Firefox, etc.).

-```python
+```python deploy
import streamlit as st
import random

@@ -40,7 +40,7 @@ hal9 deploy webapp --type streamlit

Web APIs are applications designed for other computer programs or services to interoperate with. If you wanted to enable other web apps to use your previous app, you would do this as follows:

-```python
+```python deploy
from fastapi import FastAPI
import random

4 changes: 2 additions & 2 deletions website/learn/code/index.md
@@ -37,7 +37,7 @@ As easy as that you have created your first chatbot!
The code inside `/chatbot/app.py` contains a "Hello World" chatbot that reads the user prompt and echoes the result back:


-```python
+```python deploy
prompt = input()
print(f"Echo: {prompt}")
```
@@ -55,7 +55,7 @@ hal9 create chatbot-groq --template groq

A template provides ready-to-use code for specific technologies and use cases. A popular choice is OpenAI's ChatGPT-like template with `--template openai`; the generated code will look as follows:

-```python
+```python deploy
import hal9 as h9
from openai import OpenAI

6 changes: 3 additions & 3 deletions website/learn/code/llms.md
@@ -18,7 +18,7 @@ OPENAI_API_KEY=YOURAPIKEY python

Followed by this code:

-```python
+```python deploy
import os
from openai import OpenAI

@@ -35,7 +35,7 @@ print(completion.choices[0].message.content)

However, the previous code will forget messages and take too long to display an answer. We can improve this using the memory and streaming concepts from the [building AIs](create.md) section:

-```python
+```python deploy
import hal9 as h9
from openai import OpenAI

@@ -62,7 +62,7 @@ h9.save("messages", messages, hidden = True)

You can also make use of any other LLM, including open-source LLMs. The following example makes use of Groq and Meta's Llama LLM. One advantage of using Groq over OpenAI is that their system is optimized for speed, so expect this code to run much faster:

-```python
+```python deploy
import hal9 as h9
from groq import Groq

2 changes: 1 addition & 1 deletion website/learn/code/sd.md
@@ -8,7 +8,7 @@ To generate images, you can use [stable diffusion](../genai/sd.md) techniques wi

## Flux

-```python
+```python deploy
import os
import requests
import replicate
2 changes: 1 addition & 1 deletion website/learn/code/tools.md
@@ -10,7 +10,7 @@ Hal9 simplifies the process of setting up tools with `describe()` which describe

The following code shows how to define a `calculate` function to help LLMs execute arithmetic operations. Notice that the comment in the function is used as part of the description, so it's imperative

-```python
+```python deploy
import hal9 as h9
from openai import OpenAI

4 changes: 2 additions & 2 deletions website/learn/code/websites.md
@@ -25,7 +25,7 @@ like javascript, css, svg that are referenced from index.html

You can make use of `h9.extract()` to extract the file blocks generated by the LLM, followed by `h9.save()` with a dictionary mapping file names to the content to store.

-````py
+````python
import hal9 as h9
import openai
import os
@@ -54,7 +54,7 @@ h9.save("index.html", files=files)

If you prefer to save the files without `h9.save()`, you can save them in a subfolder under `./storage`. In order to render this folder correctly, use an appropriate extension; for example, `html` for a website:

-````py
+````python
base_path = './.storage/website.index.html/'
os.makedirs(base_path, exist_ok=True)

3 changes: 2 additions & 1 deletion website/src/theme/CodeBlock/Content/String.js
@@ -48,6 +48,7 @@ export default function CodeBlockString({
});
const showLineNumbers =
showLineNumbersProp ?? containsLineNumbers(metastring);
+const deploy = metastring?.includes('deploy');
return (
<Container
as="div"
@@ -94,7 +94,7 @@
isEnabled={wordWrap.isEnabled}
/>
)}
-{language == 'python' ? (<DeployButton className={styles.codeButton} code={code} />) : (<CopyButton className={styles.codeButton} code={code} />)}
+{deploy ? (<DeployButton className={styles.codeButton} code={code} />) : (<CopyButton className={styles.codeButton} code={code} />)}
</div>
</div>
</Container>
2 changes: 1 addition & 1 deletion website/src/theme/CodeBlock/DeployButton/index.js
@@ -9,7 +9,7 @@ export default function DeployButton({code, className}) {
const [isCopied, setIsCopied] = useState(false);
const copyTimeout = useRef(undefined);
const handleCopyCode = useCallback(() => {
window.open("/deploy?code=" + encodeURIComponent(code));
window.open("/devs?code=" + encodeURIComponent(code));
setIsCopied(true);
copyTimeout.current = window.setTimeout(() => {
setIsCopied(false);
