We successfully built an index over Fortress Besieged (围城) with the project.
When I run the following in the console:
graphrag query --method local --query "这是我表妹唐晓芙" --root .
the output is:
INFO: Vector Store Args: {
"type": "lancedb",
"db_uri": "F:\GraphRAGSERVER\output\lancedb",
"container_name": "==== REDACTED ====",
"overwrite": true
}
creating llm client with {'api_key': 'REDACTED,len=35', 'type': "openai_chat", 'encoding_model': 'cl100k_base', 'model': 'deepseek-chat', 'max_tokens': 4000, 'temperature': 0.0, 'top_p': 1.0, 'n': 1, 'frequency_penalty': 0.0, 'presence_penalty': 0.0, 'request_timeout': 180.0, 'api_base': 'https://api.deepseek.com/v1#http://localhost:1234/v1', 'api_version': None, 'organization': None, 'proxy': None, 'audience': None, 'deployment_name': None, 'model_supports_json': True, 'tokens_per_minute': 0, 'requests_per_minute': 0, 'max_retries': 10, 'max_retry_wait': 10.0, 'sleep_on_rate_limit_recommendation': True, 'concurrent_requests': 25, 'responses': None}
creating embedding llm client with {'api_key': 'REDACTED,len=35', 'type': "openai_embedding", 'encoding_model': 'cl100k_base', 'model': 'text-embedding-nomic-embed-text-v1.5', 'max_tokens': 4000, 'temperature': 0, 'top_p': 1, 'n': 1, 'frequency_penalty': 0.0, 'presence_penalty': 0.0, 'request_timeout': 180.0, 'api_base': 'http://localhost:1234/v1', 'api_version': None, 'organization': None, 'proxy': None, 'audience': None, 'deployment_name': None, 'model_supports_json': None, 'tokens_per_minute': 0, 'requests_per_minute': 0, 'max_retries': 10, 'max_retry_wait': 10.0, 'sleep_on_rate_limit_recommendation': True, 'concurrent_requests': 1, 'responses': None}
SUCCESS: Local Search Response:
According to the provided data, information about Tang Xiaofu (唐晓芙) is fairly limited, but some details can be extracted from the relevant records. In the text, Tang Xiaofu interacts with Fang Hongjian (方鸿渐), and the two show a degree of closeness on social occasions. For example, Fang Hongjian once invited Tang Xiaofu to dinner and showed concern for her, asking whether she was feeling unwell [^Data:Sources(17)]. In addition, Tang Xiaofu is also connected to Miss Su (苏小姐), who once mentioned that Tang Xiaofu has many friends, suggesting she is quite active in her social circle [^Data:Sources(18)].
However, the current data does not provide enough information about Tang Xiaofu's background, personality, or behavior in more detail. A deeper look would require consulting additional texts or materials.
(myenv) F:\GraphRAGSERVER>
Then I call the API:
$Uri = "http://127.0.0.1:22222/v1/chat/completions"
$Form = @{
    model    = "local"
    messages = @(@{ role = 'user'; content = '这是我表妹唐晓芙' })
    stream   = $false
}
$Result = Invoke-WebRequest -Uri $Uri -Method Post `
    -Body (ConvertTo-Json -InputObject $Form) `
    -Headers @{ accept = 'application/json'; 'Content-Type' = 'application/json' }
$Result.Content
The API returns:
data: {"id":"chatcmpl-6c6079c8c97a4b4ca38e6b58975b4881","choices":[{"delta":{"content":"I am sorry but I am unable to answer this question given the provided data.","function_call":null,"refusal":null,"role":"assistant","tool_calls":null},"finish_reason":null,"index":0,"logprobs":null}],"created":1736973579,"model":"global","object":"chat.completion.chunk","service_tier":null,"system_fingerprint":null,"usage":null}
data: {"id":"chatcmpl-6c6079c8c97a4b4ca38e6b58975b4881","choices":[{"delta":{"content":"","function_call":null,"refusal":null,"role":"assistant","tool_calls":null},"finish_reason":"stop","index":2,"logprobs":null}],"created":1736973579,"model":"global","object":"chat.completion.chunk","service_tier":null,"system_fingerprint":null,"usage":null}
data: [DONE]
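Incidentally, even though the request set `stream` to `$false`, the body above is in the OpenAI streaming (SSE) chunk format, not a plain JSON object. A minimal sketch for pulling the assistant text out of such a body (the helper name `collect_content` is mine, not part of graphrag):

```python
import json

def collect_content(raw: str) -> str:
    """Reassemble the assistant message from OpenAI-style SSE chunk lines.

    Each line of the body looks like `data: {...}` or `data: [DONE]`.
    """
    parts = []
    for line in raw.splitlines():
        line = line.strip()
        if not line.startswith("data:"):
            continue
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            break
        chunk = json.loads(payload)
        for choice in chunk.get("choices", []):
            delta = choice.get("delta", {})
            # A chunk may carry an empty delta (e.g. the final "stop" chunk).
            parts.append(delta.get("content") or "")
    return "".join(parts)
```

Applied to the response above, this would yield the single sentence "I am sorry but I am unable to answer this question given the provided data."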
I don't know whether something went wrong with my index.