
May I use my local LLM (qwen2.5-7B in LM Studio)? #263

Open
Asumer14 opened this issue Dec 11, 2024 · 9 comments

@Asumer14

I'd also like to rebuild the map and change it to an AI Family.
I'm really interested in this project, and I'm working on my own as well.
Right now I'm having some trouble building the map, so I'd like to draw some inspiration from your project and discuss it.

@chuanxin

If you want to replace the default model, based on my experience, you can modify the LLM_CONFIG settings in the convex/util/llm.ts file in the project directory. For your needs, the modification will look something like this: chatModel: 'qwen2.5:7b' as const. You can specify other versions aside from 7b and also update the corresponding embeddingModel. I'm using 'znbang/bge:large-en-v1.5-q8_0'.

Additionally, I couldn't successfully set up a backend service for Convex locally. So, to use the local LLM, I set up a third-party tunnel service (e.g., ngrok) to connect to my localhost:11434.
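To make the edit concrete, here is a hedged sketch of what the change might look like. Only `chatModel` and `embeddingModel` come from this thread; the exact shape of `LLM_CONFIG` in `convex/util/llm.ts` may differ in your checkout, and the other fields below (`url`, `ollama`, `embeddingDimension`) are assumptions to verify against the real file:

```typescript
// Hypothetical sketch of the LLM_CONFIG edit described above. Only
// chatModel/embeddingModel come from this thread; the remaining fields
// are assumptions -- check them against your actual convex/util/llm.ts.
export const LLM_CONFIG = {
  ollama: true,                      // assumed flag for a local backend
  url: 'http://127.0.0.1:11434',     // local Ollama / LM Studio endpoint
  chatModel: 'qwen2.5:7b' as const,  // other size tags work too
  embeddingModel: 'znbang/bge:large-en-v1.5-q8_0',
  embeddingDimension: 1024,          // bge-large-en-v1.5 emits 1024-dim vectors
};
```

If the Convex backend runs in the cloud rather than locally, the `url` has to be reachable from outside your machine, which is where a tunnel such as `ngrok http 11434` comes in.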

@bruno686

Did you run the project successfully? I'm stuck at 12/26/2024 4:36:40 PM [CONVEX (aiTown/agentOperations: agentGenerateMessage)] Uncaught Error: The request to http://127.0.0.1:11434/api/embeddings is forbidden. I'm using local Ollama. It's really strange.
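One possible cause (an assumption, not confirmed in this thread): Ollama rejects requests whose Origin or Host header it doesn't recognize, which surfaces as a forbidden/403 error. If that's what's happening, relaxing the origin check before restarting the server may help:

```shell
# Assumed fix: let the local Ollama server accept requests from any origin
# and listen on all interfaces, so a tunnel or container can reach it.
export OLLAMA_HOST=0.0.0.0:11434
export OLLAMA_ORIGINS='*'
# then restart the server:
# ollama serve
```

Both `OLLAMA_HOST` and `OLLAMA_ORIGINS` are documented Ollama environment variables; whether they apply here depends on where the 403 is actually coming from.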

@Ycx2000

Ycx2000 commented Jan 7, 2025


Hi, I'm having the same problem... have you found a solution yet?

@bruno686

bruno686 commented Jan 7, 2025

No, sadly, I can't run this project locally. But when I use together.AI, I find it works.
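For anyone taking the same route: switching to a hosted provider generally means pointing the same config at an OpenAI-compatible endpoint with an API key. A hedged sketch, reusing the field names @chuanxin described above; the endpoint URL, model names, and environment-variable name are assumptions to check against together.AI's documentation:

```typescript
// Hypothetical sketch: same config shape, but targeting together.AI's
// OpenAI-compatible API instead of a local server. All values below are
// assumptions -- verify them against together.AI's docs before use.
export const LLM_CONFIG = {
  url: 'https://api.together.xyz/v1',
  apiKey: process.env.TOGETHER_API_KEY, // assumed env var; set it in your Convex env
  chatModel: 'meta-llama/Llama-3-8b-chat-hf' as const,
  embeddingModel: 'togethercomputer/m2-bert-80M-8k-retrieval',
};
```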

@Ycx2000

Ycx2000 commented Jan 7, 2025

Thanks for your reply. I haven't seen anyone else mention this problem except you. Could it be related to the fact that we're in China? Maybe some kind of network issue... I'm not sure.

@Ycx2000

Ycx2000 commented Jan 7, 2025

But together.AI works for me too. Thanks.

@bruno686

bruno686 commented Jan 7, 2025 via email

@bruno686

bruno686 commented Jan 7, 2025 via email

@Ycx2000

Ycx2000 commented Jan 7, 2025

Yes hhh
