Problem Description
I am using Ollama to run qwen2.5:7b and found that with long texts, the context seems to be limited to roughly 2000-4000 tokens. Even setting the maximum conversation count to "unlimited" has no effect. After consulting the official Ollama documentation, I discovered that passing the option num_ctx: 32000 increases the context length for qwen2.5:7b.
Proposed Solution
Add a num_ctx field to Ollama's model configuration so that users can modify the context length.
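As a sketch of what this would look like in practice, the snippet below builds a request payload for Ollama's `/api/generate` endpoint with the `num_ctx` option set to 32000, as described above. The prompt text is a placeholder; the model name and context value are taken from this issue.

```python
import json

# Sketch: request payload for Ollama's /api/generate REST endpoint,
# raising the context window of qwen2.5:7b via the num_ctx option.
payload = {
    "model": "qwen2.5:7b",
    "prompt": "Summarize this long document ...",  # placeholder prompt
    "stream": False,
    "options": {
        # Context window in tokens; Ollama's default is much smaller,
        # which is why long inputs appear truncated at 2000-4000 tokens.
        "num_ctx": 32000,
    },
}

body = json.dumps(payload)
print(body)
```

The resulting JSON can be POSTed to `http://localhost:11434/api/generate` with any HTTP client; a UI-level `num_ctx` field would simply populate this `options` object on every request.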