PandasAI was working with watsonx until a few days ago; now getting the error "pandasai.exceptions.NoResultFoundError: No result returned" again #1420
Comments
Hey there, @raviisrani! I'm here to help you with any bugs or questions you have while waiting for a human maintainer. Let's squash those bugs together!

The `NoResultFoundError` is raised when PandasAI's result parser does not receive a usable result from the code the LLM generated. In your case, this error might be related to how the watsonx LLM's responses are handled. Here are a few steps you can take to troubleshoot and potentially resolve the issue:

- Verify that your watsonx API key, URL, and project ID are still valid.
- Try a different supported watsonx model to rule out a model-specific problem.
- Test the same prompt and data with a different LLM to isolate the watsonx integration.
- Check whether anything in your environment or library versions changed recently.

If these steps do not resolve the issue, you might want to look into any recent updates or discussions related to similar issues in the PandasAI repository.
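One way to rule PandasAI in or out is to call watsonx directly and confirm the model itself still returns text. This is a minimal sketch: the `Model` class and credential-dict keys follow the `ibm_watsonx_ai.foundation_models` API (which may differ between versions), and the helper name `check_model` plus the `WATSONX_APIKEY`/`WATSONX_PROJECT_ID` environment variables are our own choices, not part of either library.

```python
import os

# Hedged sketch: query the watsonx model outside PandasAI. If this returns
# text, the model and credentials are fine and the failure lies in how
# PandasAI handles the response.
def check_model(api_key, url, project_id, model_id):
    # Imported lazily so defining the helper does not require the SDK.
    from ibm_watsonx_ai.foundation_models import Model
    model = Model(
        model_id=model_id,
        credentials={"apikey": api_key, "url": url},
        project_id=project_id,
        params={"decoding_method": "greedy", "max_new_tokens": 50},
    )
    return model.generate_text("Reply with the single word OK.")

# Only attempt the network call when credentials are supplied via the
# environment (hypothetical variable names).
if os.getenv("WATSONX_APIKEY") and os.getenv("WATSONX_PROJECT_ID"):
    print(check_model(os.environ["WATSONX_APIKEY"],
                      "https://jp-tok.ml.cloud.ibm.com",
                      os.environ["WATSONX_PROJECT_ID"],
                      "meta-llama/llama-3-1-8b-instruct"))
```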
My responses below:

- Ravi: Tried testing with other models on the watsonx platform; same results.
- Ravi: The same credentials and API key are working on other projects.
- Ravi: The prompt works perfectly with Gemini Flash; in fact, the same setup was working until a few days ago.
- Ravi: No changes in my local conda environment; I have not updated the Python libraries.
- Ravi: Tried the same code, data set, and prompt with Gemini Flash; it works perfectly! Is there a way to debug the PandasAI classes, such as the result-validation classes, to see what is causing this bug? Can you assign someone to fix it? I have been having this issue with the PandasAI library mainly with watsonx; it works great with Gemini.
System Info
Python : 3.12.4
pandasai : 2.2.14
ibm_watsonx_ai : 0.2.6
🐛 Describe the bug
from pandasai import SmartDataframe
import pandas as pd
from pandasai.llm import IBMwatsonx

# textwx.csv uploaded to GitHub for testing

# Create the prompt variable
First_prompt = "What is the average margin for 22?"
print("\nPrompt first: ", First_prompt)

# Initialize the watsonx LLM
parameters1 = {
    "decoding_method": "greedy",
    "max_new_tokens": 1000,
    "min_new_tokens": 1,
}
# model="ibm/granite-13b-chat-v2"
# model="mistralai/mixtral-8x7b-instruct-v01"
# model="codellama/codellama-34b-instruct-hf"
# model="google/flan-t5-xl"
llm = IBMwatsonx(
    api_key='0KX97ceUcTdzkUNGOxH_-RQgAydygoveSehxrLTHuqvd',
    watsonx_url="https://jp-tok.ml.cloud.ibm.com",
    watsonx_project_id='831054fa-285f-4d3d-a53f-ada70e8ac329',
    model="meta-llama/llama-3-1-8b-instruct",
)

# Supported watsonx models:
# 'codellama/codellama-34b-instruct-hf', 'elyza/elyza-japanese-llama-2-7b-instruct',
# 'google/flan-t5-xl', 'google/flan-t5-xxl', 'google/flan-ul2', 'ibm/granite-13b-chat-v2',
# 'ibm/granite-13b-instruct-v2', 'ibm/granite-20b-multilingual', 'ibm/granite-7b-lab',
# 'ibm/granite-8b-japanese', 'meta-llama/llama-3-1-70b-instruct', 'meta-llama/llama-3-1-8b-instruct',
# 'meta-llama/llama-3-70b-instruct', 'meta-llama/llama-3-8b-instruct',
# 'mistralai/mixtral-8x7b-instruct-v01', 'mncai/llama2-13b-dpo-v7'

# Create a SmartDataframe
df1 = pd.read_csv("textwx.csv")
df = SmartDataframe(df1, config={"llm": llm})
print("\nCreated smart data frame with PandasAI")
print("\nAverage margin : ", df.chat(First_prompt))