feat: add ollama chat example
Signed-off-by: yuluo-yx <[email protected]>
yuluo-yx committed Dec 24, 2024
1 parent 8983563 commit ece9f4f
Showing 20 changed files with 620 additions and 1 deletion.
2 changes: 2 additions & 0 deletions docker-compose/.gitignore
@@ -0,0 +1,2 @@
# ollama models
ollama/models/*
11 changes: 11 additions & 0 deletions docker-compose/es/config/es.yaml
@@ -0,0 +1,11 @@
cluster.name: docker-es
node.name: es-node-1
network.host: 0.0.0.0
network.publish_host: 0.0.0.0
http.port: 9200
http.cors.enabled: true
http.cors.allow-origin: "*"
bootstrap.memory_lock: true

# Disable authentication/authorization (enabled by default in ES 8.x)
xpack.security.enabled: false
24 changes: 24 additions & 0 deletions docker-compose/es/docker-compose.yaml
@@ -0,0 +1,24 @@
version: '3.8'

services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:8.16.1
    container_name: elasticsearch
    privileged: true
    environment:
      - "cluster.name=elasticsearch"
      - "discovery.type=single-node"
      # Keep the JVM heap below the 1000M container memory limit set under deploy.resources
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
      - bootstrap.memory_lock=true
    volumes:
      - ./config/es.yaml:/usr/share/elasticsearch/config/elasticsearch.yml
    ports:
      - "9200:9200"
      - "9300:9300"
    deploy:
      resources:
        limits:
          cpus: "2"
          memory: 1000M
        reservations:
          memory: 200M
Empty file added docker-compose/mysql/.keep
Empty file.
110 changes: 110 additions & 0 deletions docker-compose/ollama/README.md
@@ -0,0 +1,110 @@
# Deploy Ollama and Install Models

> To minimize pollution of the local environment, this deployment uses Docker.

GitHub: https://github.com/ollama/ollama

## Prepare the Deployment File

```yml
version: '3.8'

services:
  ollama:
    volumes:
      # Optional: mount a local folder to /root/.ollama inside the container (model download location)
      - ./models:/root/.ollama
    container_name: spring-ai-alibaba-ollama
    pull_policy: always
    tty: true
    restart: unless-stopped
    image: ollama/ollama:latest
    ports:
      - "11434:11434" # Ollama API port
```

Then run `docker compose up -d` in this directory to start the container. Once the image pull completes, you can move on to the next step.

## Download an LLM Model

Reference models:

| **Model** | **Parameters** | **Size** | **Download** |
| ------------------ | -------------- | -------- | ------------------------------ |
| Llama 3 | 8B | 4.7GB | `ollama run llama3` |
| Qwen               | 4B             | 2.3GB    | `ollama run qwen:4b`           |
| Llama 3 | 70B | 40GB | `ollama run llama3:70b` |
| Phi-3              | 3.8B           | 2.3GB    | `ollama run phi3`              |
| Mistral | 7B | 4.1GB | `ollama run mistral` |
| Neural Chat | 7B | 4.1GB | `ollama run neural-chat` |
| Starling | 7B | 4.1GB | `ollama run starling-lm` |
| Code Llama | 7B | 3.8GB | `ollama run codellama` |
| Llama 2 Uncensored | 7B | 3.8GB | `ollama run llama2-uncensored` |
| LLaVA | 7B | 4.5GB | `ollama run llava` |
| Gemma | 2B | 1.4GB | `ollama run gemma:2b` |
| Gemma | 7B | 4.8GB | `ollama run gemma:7b` |
| Solar | 10.7B | 6.1GB | `ollama run solar` |

This example uses the llama3 (8B) model:

```shell
docker exec -it spring-ai-alibaba-ollama bash
# Inside the container, run:
ollama run llama3
```

After it succeeds, you will see output like the following:

```shell
root@c5e5ff20a533:/# ollama run llama3
pulling manifest
pulling 6a0746a1ec1a... 100% ▕██████████████████████████████████████████████████████████████████████████████████████████████████████████▏ 4.7 GB
pulling 4fa551d4f938... 100% ▕██████████████████████████████████████████████████████████████████████████████████████████████████████████▏ 12 KB
pulling 8ab4849b038c... 100% ▕██████████████████████████████████████████████████████████████████████████████████████████████████████████▏ 254 B
pulling 577073ffcc6c... 100% ▕██████████████████████████████████████████████████████████████████████████████████████████████████████████▏ 110 B
pulling 3f8eb4da87fa... 100% ▕██████████████████████████████████████████████████████████████████████████████████████████████████████████▏ 485 B
verifying sha256 digest
writing manifest
removing any unused layers
success
>>> 你好
💖 你好!我很高兴地看到你的消息! 😊
>>> 你能介绍下自己吗
😊 I'd be happy to introduce myself.
My name is LLaMA, and I'm a large language model trained by Meta AI. I'm a computer program designed to understand and generate human-like text, so we can have
conversations like this one! 🤖
I was trained on a massive dataset of text from the internet, which allows me to learn about various topics, including history, science, culture, and more. This
training enables me to answer questions, provide information, and even engage in creative writing or storytelling.
As a conversational AI, my goal is to assist and entertain users like you. I'm designed to be helpful, friendly, and respectful, so please feel free to ask me
anything or share your thoughts with me! 💬
```

## Access

The previous section showed interactive access via `ollama run`. This section shows how to access the model through the HTTP API.

### API

```shell
curl http://localhost:11434/api/generate -d '{
  "model": "llama3",
  "prompt": "请分别翻译成中文、韩文、日文 -> Meta Llama 3: The most capable openly available LLM to date",
  "stream": false
}'

curl http://localhost:11434/api/chat -d '{
  "model": "llama3",
  "messages": [
    {
      "role": "user",
      "content": "why is the sky blue?"
    }
  ],
  "stream": true
}'
```
13 changes: 13 additions & 0 deletions docker-compose/ollama/docker-compose.yml
@@ -0,0 +1,13 @@
version: '3.8'

services:
  ollama:
    volumes:
      - ./models:/root/.ollama # Mount a local folder to /root/.ollama inside the container (model download location)
    container_name: spring-ai-alibaba-ollama
    pull_policy: always
    tty: true
    restart: unless-stopped
    image: ollama/ollama:latest
    ports:
      - "11434:11434" # Ollama API port
Empty file added docker-compose/redis/.keep
Empty file.
@@ -0,0 +1,7 @@
# Spring AI Alibaba Ollama Chat Client Example

This example demonstrates how to use Spring AI to integrate with Ollama.

Before running the example, you should first set up the [Ollama](../../../docker-compose/ollama/README.md) environment.

For more usage details, see: [Spring AI Ollama](https://docs.spring.io/spring-ai/reference/api/chat/ollama-chat.html)
@@ -0,0 +1,52 @@
<?xml version="1.0" encoding="UTF-8"?>

<!--
Copyright 2023-2024 the original author or authors.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
https://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->

<project xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xmlns="http://maven.apache.org/POM/4.0.0"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <parent>
        <groupId>com.alibaba.cloud.ai</groupId>
        <artifactId>ollama-chat</artifactId>
        <version>${revision}</version>
        <relativePath>../pom.xml</relativePath>
    </parent>

    <artifactId>ollama-chat-client</artifactId>
    <version>${revision}</version>

    <description>Spring AI Alibaba Ollama Chat Client Example</description>
    <name>Spring AI Alibaba Ollama Chat Client Example</name>

    <build>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
                <version>${spring-boot.version}</version>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-deploy-plugin</artifactId>
                <version>${maven-deploy-plugin.version}</version>
            </plugin>
        </plugins>
    </build>

</project>
@@ -0,0 +1,36 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/

package com.alibaba.cloud.ai.example.chat.openai;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

/**
* @author yuluo
* @author <a href="mailto:[email protected]">yuluo</a>
*/

@SpringBootApplication
public class OllamaChatClientApplication {

    public static void main(String[] args) {

        SpringApplication.run(OllamaChatClientApplication.class, args);
    }

}
@@ -0,0 +1,93 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/

package com.alibaba.cloud.ai.example.chat.openai.controller;

import jakarta.servlet.http.HttpServletResponse;
import reactor.core.publisher.Flux;

import org.springframework.ai.chat.client.ChatClient;
import org.springframework.ai.chat.client.advisor.MessageChatMemoryAdvisor;
import org.springframework.ai.chat.client.advisor.SimpleLoggerAdvisor;
import org.springframework.ai.chat.memory.InMemoryChatMemory;
import org.springframework.ai.chat.model.ChatModel;
import org.springframework.ai.ollama.api.OllamaOptions;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

/**
* @author yuluo
* @author <a href="mailto:[email protected]">yuluo</a>
*/

@RestController
@RequestMapping("/ollama/chat-client")
public class OllamaClientController {

    private static final String DEFAULT_PROMPT = "你好,介绍下你自己!请用中文回答。";

    private final ChatClient ollamaChatClient;

    private final ChatModel ollamaChatModel;

    public OllamaClientController(ChatModel chatModel) {

        this.ollamaChatModel = chatModel;

        // ChatClient parameters can be configured at construction time.
        // {@link org.springframework.ai.chat.client.ChatClient}
        this.ollamaChatClient = ChatClient.builder(chatModel)
                // Advisor that provides chat memory.
                // When using chat memory, specify a conversation ID so Spring AI can manage the context.
                .defaultAdvisors(
                        new MessageChatMemoryAdvisor(new InMemoryChatMemory())
                )
                // Advisor that logs requests and responses.
                .defaultAdvisors(
                        new SimpleLoggerAdvisor()
                )
                // Set the ChatModel Options used by this ChatClient.
                .defaultOptions(
                        OllamaOptions.builder()
                                .withTopP(0.7)
                                .withModel("llama3")
                                .build()
                )
                .build();
    }

    /**
     * Simple (blocking) ChatClient call.
     */
    @GetMapping("/simple/chat")
    public String simpleChat() {

        return ollamaChatClient.prompt(DEFAULT_PROMPT).call().content();
    }

    /**
     * Streaming ChatClient call.
     */
    @GetMapping("/stream/chat")
    public Flux<String> streamChat(HttpServletResponse response) {

        response.setCharacterEncoding("UTF-8");
        return ollamaChatClient.prompt(DEFAULT_PROMPT).stream().content();
    }

}
@@ -0,0 +1,12 @@
server:
  port: 10006

spring:
  application:
    name: spring-ai-alibaba-ollama-chat-client-example

  ai:
    ollama:
      base-url: http://localhost:11434
      chat:
        options:
          model: llama3
@@ -0,0 +1,7 @@
# Spring AI Alibaba Ollama Chat Model Example

This example demonstrates how to use Spring AI to integrate with Ollama.

Before running the example, you should first set up the [Ollama](../../../docker-compose/ollama/README.md) environment.

For more usage details, see: [Spring AI Ollama](https://docs.spring.io/spring-ai/reference/api/chat/ollama-chat.html)