Bug Description
OpenAI does not accept a ChatMessage whose content is null; it only accepts a string. The issue surfaces whenever an agent tool returns None or an empty string, for example a RetrieverTool with an empty result. A minimal request that triggers the same 400 is sketched below.
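A minimal sketch of the failing request, independent of llama-index (assumes openai-python 1.x and an API key; the model name and tool name are placeholders, not taken from the issue):

```python
from openai import OpenAI

client = OpenAI()
# A tool message whose "content" is null mirrors what llama-index sends when
# a tool returns nothing; at the time of this issue the API rejected it with
# the same 400 "expected a string, got null" shown in the tracebacks below.
client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model
    messages=[
        {"role": "user", "content": "hi"},
        {
            "role": "assistant",
            "content": None,  # null is allowed here, alongside tool_calls
            "tool_calls": [{
                "id": "call_0",
                "type": "function",
                "function": {"name": "search", "arguments": "{}"},
            }],
        },
        # Rejected: tool message content must be a string, not null.
        {"role": "tool", "tool_call_id": "call_0", "content": None},
    ],
)
```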
Version
0.12.9 to 0.12.11 (llama-index-llms-openai 0.3.10 to 0.3.13)
Steps to Reproduce
Create an agent with a RetrieverTool.
Add postprocessors to filter out nodes (I use LLMRerank).
Adjust the filter so that 0 nodes remain in the end (a repro sketch follows these steps).
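A hedged repro sketch of the steps above. The agent class, model, and corpus path are assumptions (the issue does not show the app code), and a SimilarityPostprocessor with an unreachable cutoff stands in for LLMRerank so that an empty node list is guaranteed:

```python
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex
from llama_index.core.postprocessor import SimilarityPostprocessor
from llama_index.core.tools import RetrieverTool
from llama_index.agent.openai import OpenAIAgent  # assumed agent class
from llama_index.llms.openai import OpenAI

# Build a tiny index; "docs" is a placeholder corpus directory.
index = VectorStoreIndex.from_documents(SimpleDirectoryReader("docs").load_data())

retriever_tool = RetrieverTool.from_defaults(
    retriever=index.as_retriever(similarity_top_k=3),
    # similarity_cutoff=1.0 filters out essentially every node, standing in
    # for an LLMRerank configured strictly enough to return 0 nodes.
    node_postprocessors=[SimilarityPostprocessor(similarity_cutoff=1.0)],
)

agent = OpenAIAgent.from_tools([retriever_tool], llm=OpenAI(model="gpt-4o-mini"))
# With 0 nodes, the tool output is empty, the tool message content becomes
# null, and the request fails with the 400 shown in the logs below.
print(agent.chat("What does the corpus say about topic X?"))
```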
Relevant Logs/Tracebacks
Traceback (most recent call last):
File "C:\Users\habi6\Desktop\PROJECT\aidoc-llm-service\.venv\Lib\site-packages\llama_index\core\instrumentation\dispatcher.py", line 367, in async_wrapper
result = await func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\habi6\Desktop\PROJECT\aidoc-llm-service\.venv\Lib\site-packages\llama_index\core\chat_engine\types.py", line 234, in awrite_response_to_history
async for chat in self.achat_stream:
File "C:\Users\habi6\Desktop\PROJECT\aidoc-llm-service\.venv\Lib\site-packages\llama_index\core\llms\callbacks.py", line 88, in wrapped_gen
async for x in f_return_val:
File "C:\Users\habi6\Desktop\PROJECT\aidoc-llm-service\.venv\Lib\site-packages\llama_index\llms\openai\base.py", line 724, in gen
async for response in await aclient.chat.completions.create(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\habi6\Desktop\PROJECT\aidoc-llm-service\.venv\Lib\site-packages\openai\resources\chat\completions.py", line 1720, in create
return await self._post(
^^^^^^^^^^^^^^^^^
File "C:\Users\habi6\Desktop\PROJECT\aidoc-llm-service\.venv\Lib\site-packages\openai\_base_client.py", line 1849, in post
return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\habi6\Desktop\PROJECT\aidoc-llm-service\.venv\Lib\site-packages\openai\_base_client.py", line 1543, in request
return await self._request(
^^^^^^^^^^^^^^^^^^^^
File "C:\Users\habi6\Desktop\PROJECT\aidoc-llm-service\.venv\Lib\site-packages\openai\_base_client.py", line 1644, in _request
raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'error': {'message': "Invalid value for 'content': expected a string, got null.", 'type': 'invalid_request_error', 'param': 'messages.[3].content', 'code': None}}
INFO: 127.0.0.1:51858 - "POST /query HTTP/1.1" 200 OK
ERROR: Exception in ASGI application
Traceback (most recent call last):
File "C:\Users\habi6\Desktop\PROJECT\aidoc-llm-service\.venv\Lib\site-packages\uvicorn\protocols\http\httptools_impl.py", line 409, in run_asgi
result = await app( # type: ignore[func-returns-value]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\habi6\Desktop\PROJECT\aidoc-llm-service\.venv\Lib\site-packages\uvicorn\middleware\proxy_headers.py", line 60, in __call__
return await self.app(scope, receive, send)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\habi6\Desktop\PROJECT\aidoc-llm-service\.venv\Lib\site-packages\fastapi\applications.py", line 1054, in __call__
await super().__call__(scope, receive, send)
File "C:\Users\habi6\Desktop\PROJECT\aidoc-llm-service\.venv\Lib\site-packages\starlette\applications.py", line 113, in __call__
await self.middleware_stack(scope, receive, send)
File "C:\Users\habi6\Desktop\PROJECT\aidoc-llm-service\.venv\Lib\site-packages\starlette\middleware\errors.py", line 187, in __call__
raise exc
File "C:\Users\habi6\Desktop\PROJECT\aidoc-llm-service\.venv\Lib\site-packages\starlette\middleware\errors.py", line 165, in __call__
await self.app(scope, receive, _send)
File "C:\Users\habi6\Desktop\PROJECT\aidoc-llm-service\.venv\Lib\site-packages\starlette\middleware\base.py", line 185, in __call__
with collapse_excgroups():
^^^^^^^^^^^^^^^^^^^^
File "C:\Program Files\Python312\Lib\contextlib.py", line 158, in __exit__
self.gen.throw(value)
File "C:\Users\habi6\Desktop\PROJECT\aidoc-llm-service\.venv\Lib\site-packages\starlette\_utils.py", line 82, in collapse_excgroups
raise exc
File "C:\Users\habi6\Desktop\PROJECT\aidoc-llm-service\.venv\Lib\site-packages\starlette\middleware\base.py", line 188, in __call__
await response(scope, wrapped_receive, send)
File "C:\Users\habi6\Desktop\PROJECT\aidoc-llm-service\.venv\Lib\site-packages\starlette\middleware\base.py", line 222, in __call__
async for chunk in self.body_iterator:
File "C:\Users\habi6\Desktop\PROJECT\aidoc-llm-service\.venv\Lib\site-packages\starlette\middleware\base.py", line 179, in body_stream
raise app_exc
File "C:\Users\habi6\Desktop\PROJECT\aidoc-llm-service\.venv\Lib\site-packages\starlette\middleware\base.py", line 149, in coro
await self.app(scope, receive_or_disconnect, send_no_error)
File "C:\Users\habi6\Desktop\PROJECT\aidoc-llm-service\.venv\Lib\site-packages\starlette\middleware\base.py", line 185, in __call__
with collapse_excgroups():
^^^^^^^^^^^^^^^^^^^^
File "C:\Program Files\Python312\Lib\contextlib.py", line 158, in __exit__
self.gen.throw(value)
File "C:\Users\habi6\Desktop\PROJECT\aidoc-llm-service\.venv\Lib\site-packages\starlette\_utils.py", line 82, in collapse_excgroups
raise exc
File "C:\Users\habi6\Desktop\PROJECT\aidoc-llm-service\.venv\Lib\site-packages\starlette\responses.py", line 255, in wrap
await func()
File "C:\Users\habi6\Desktop\PROJECT\aidoc-llm-service\.venv\Lib\site-packages\starlette\responses.py", line 244, in stream_response
async for chunk in self.body_iterator:
File "C:\Users\habi6\Desktop\PROJECT\aidoc-llm-service\app\index_server.py", line 295, in gen
async for token in my_query.async_response_gen():
File "C:\Users\habi6\Desktop\PROJECT\aidoc-llm-service\.venv\Lib\site-packages\llama_index\core\chat_engine\types.py", line 310, in async_response_gen
raise self.exception
File "C:\Users\habi6\Desktop\PROJECT\aidoc-llm-service\.venv\Lib\site-packages\llama_index\core\instrumentation\dispatcher.py", line 367, in async_wrapper
result = await func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\habi6\Desktop\PROJECT\aidoc-llm-service\.venv\Lib\site-packages\llama_index\core\chat_engine\types.py", line 234, in awrite_response_to_history
async for chat in self.achat_stream:
File "C:\Users\habi6\Desktop\PROJECT\aidoc-llm-service\.venv\Lib\site-packages\llama_index\core\llms\callbacks.py", line 88, in wrapped_gen
async for x in f_return_val:
File "C:\Users\habi6\Desktop\PROJECT\aidoc-llm-service\.venv\Lib\site-packages\llama_index\llms\openai\base.py", line 724, in gen
async for response in await aclient.chat.completions.create(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\habi6\Desktop\PROJECT\aidoc-llm-service\.venv\Lib\site-packages\openai\resources\chat\completions.py", line 1720, in create
return await self._post(
^^^^^^^^^^^^^^^^^
File "C:\Users\habi6\Desktop\PROJECT\aidoc-llm-service\.venv\Lib\site-packages\openai\_base_client.py", line 1849, in post
return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\habi6\Desktop\PROJECT\aidoc-llm-service\.venv\Lib\site-packages\openai\_base_client.py", line 1543, in request
return await self._request(
^^^^^^^^^^^^^^^^^^^^
File "C:\Users\habi6\Desktop\PROJECT\aidoc-llm-service\.venv\Lib\site-packages\openai\_base_client.py", line 1644, in _request
raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'error': {'message': "Invalid value for 'content': expected a string, got null.", 'type': 'invalid_request_error', 'param': 'messages.[3].content', 'code': None}}
To resolve the openai.BadRequestError (error code 400, invalid value for 'content') raised when a RetrieverTool returns an empty result on the affected llama-index-llms-openai versions, ensure that the tool's content never reaches the OpenAI message conversion as an empty string. The conversion contains:
content_txt = None if content_txt == "" else content_txt
so an empty tool result is turned into None, and OpenAI rejects null content for tool messages, which produces the error above [1].
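A user-side workaround sketch (an assumption, not an official fix): wrap retrieval in a function tool that never returns an empty string, so the conversion above never sees empty content. The `retriever` variable, tool name, and fallback text are placeholders:

```python
from llama_index.core.tools import FunctionTool

def search_docs(query: str) -> str:
    """Retrieve relevant passages; never return an empty string."""
    nodes = retriever.retrieve(query)  # `retriever` assumed to exist in scope
    text = "\n\n".join(n.get_content() for n in nodes)
    # Guard: empty tool output would be converted to null content and rejected.
    return text or "No relevant documents were found."

safe_tool = FunctionTool.from_defaults(fn=search_docs)
```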