Unable to Run Playbooks with Xata Agent – Assistance Required #220
-
Hello Xata Team,

I've successfully installed the Xata Agent on one of our development PostgreSQL servers. The agent appears to be connecting to the database and collecting the relevant information as expected. However, I'm currently facing an issue when attempting to run playbooks: they fail to execute and throw an error. I've reviewed the Docker logs and noticed the following error message:

xata-agent-1 | responseBody: '{\n' +

Could you please assist in identifying the cause and guide me on how to resolve this?

Best regards,
Replies: 28 comments · 5 replies
-
I am using the open-source Xata Agent.
-
Hello @Karthikr2212, thanks for the detailed report and log excerpt. The error in your Docker logs shows that the Xata Agent is trying to run playbooks that require OpenAI access, but your account is out of quota/credits. Please check your OpenAI account usage and quota status; if it is exceeded, you'll need to wait for the reset or upgrade your plan. Also confirm that the API key configured for the Agent is valid. If you've restored quota or added a valid key and still see the issue, let us know and we'll assist further.
-
Hello Xata Team, is the OPENAI_API_KEY required in env.production? I used a test key, but I'm seeing the error: "You exceeded your current quota, please check your plan and billing details." It seems like I need a paid API key. Is there any way to run the playbooks with a test key? Regards,
-
Hello @Karthikr2212, you shouldn't need an OpenAI API key to use the Agent. Could you please share which Agent version you're using?
-
Can we use this agent without an OpenAI API key?
-
Yes. |
-
Glad to hear it, let us know if there are any problems! |
-
Hi @gulcin, I am using the xataio/agent:0.3.1 version. I am able to connect to the DB, but I am unable to use playbooks; it is throwing errors.
-
Without an API key I haven't seen the playbooks and chat options. I'm getting the error below in the Docker logs and a 500 Internal Server Error: No providers enabled. Please configure API keys
-
Thanks for reaching out again; the error you mention comes from this code path here. Please double-check that the names of the env vars are correct (no typos or anything) and that they are being loaded into Docker correctly.
If the problem persists, please share reproduction steps for your setup and we will try to reproduce it on our end.
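One quick way to confirm that the variables are actually reaching the container is to inspect the resolved compose config and the container environment. This is only a sketch: the service name xata-agent and the grep pattern are assumptions, so adjust them to match your docker-compose.yml.

```sh
# Show the fully resolved compose configuration, including the environment
# each service will receive (env_file entries are expanded here).
docker compose config

# Check what actually ended up inside the running container.
# Replace "xata-agent" with the service name from your docker-compose.yml.
docker compose exec xata-agent env | grep -E 'OPENAI|ANTHROPIC|API_KEY|PUBLIC_URL'
```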
-
I used the field below in .env.production:
PUBLIC_URL=http://10.x.x.x:8080
and got the following in the logs:
[✓] migrations applied successfully!
Starting scheduler with 60s interval (60000ms)
-
If the above configuration is not correct, could you just give me a sample configuration file?
-
Ah, you need to add one of the AI provider API keys, so one of the provider key environment variables (OPENAI_API_KEY, for example) in .env.production.
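For reference, a minimal .env.production sketch under the assumptions above. PUBLIC_URL and OPENAI_API_KEY appear earlier in this thread; the Anthropic line is only an example of an alternative provider, and the key values are placeholders.

```env
# Where the agent UI is served
PUBLIC_URL=http://10.x.x.x:8080

# At least one AI provider key is required for playbooks and chat
OPENAI_API_KEY=sk-...replace-with-a-real-key...

# Alternative provider, if you use Anthropic instead (check the env handling
# in apps/dbagent/src/lib/env/server.ts for the exact variable names)
# ANTHROPIC_API_KEY=...
```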
-
Yes, that's what I am saying: without an API key set, the playbooks and chat options are not available.
-
So it means we can only use it to connect to the database and collect environment information?
-
Yes, there has to be at least one AI provider for the chat to work. The chat is powered by LLMs; without one, it does not work. Sorry about the confusion earlier.
-
OK, so without an API key it doesn't work :( I was getting the same error.
-
Can we get something to use offline models like Llama 3.2 until we purchase the keys? Thanks in advance.
-
Hi Team, can we use our own organization's internal API keys (https://testapi.copart.com/api)? If that works, please guide me on how to implement it.
-
Yes, it works with local LLMs; see agent/apps/dbagent/src/lib/env/server.ts, lines 29 to 30 in 30aecbc. You can provide the Ollama-related variables defined there.
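As a rough sketch of the local-LLM route (the exact environment variable name the agent reads must come from those lines in server.ts; the commented name below is a placeholder, not confirmed):

```sh
# Pull and serve a local model with Ollama
ollama pull llama3.2
ollama serve                      # serves an API on http://localhost:11434 by default

# Then, in .env.production, point the agent at that endpoint, e.g.:
# OLLAMA_URL=http://host.docker.internal:11434   # placeholder name; check server.ts
# (the URL must be reachable from inside the agent's Docker container)
```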
-
What's the setup like? That URL is not publicly accessible. Maybe there is a way to make it work with LiteLLM, which the Agent supports, but I am not yet sure what is on your side of the API.
-
Can you please give me an example of what I have to update?
-
It is a private URL, likely only accessible behind a VPN. |
-
Actually, this is my organization's internal one (we can get API keys), so it cannot be accessed from outside. So can you please let me know what changes need to be made in the file below?
-
Each model type uses a different API. Depending on which one is available at your org, you would need to choose either Ollama or LiteLLM. I do not have enough information to help you without knowing what flavor of AI API is offered internally at your org. Maybe try asking ChatGPT, or someone at your org who has set up the internal one?
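If the internal endpoint turns out to be OpenAI-compatible, one possible approach is to put a LiteLLM proxy in front of it and let the agent talk to LiteLLM. This is only a sketch under that assumption; the model name, backing model, and INTERNAL_API_KEY variable are placeholders.

```yaml
# litellm_config.yaml - proxy an internal, OpenAI-compatible API
model_list:
  - model_name: internal-model               # name clients will request
    litellm_params:
      model: openai/gpt-4o                   # route through the OpenAI-compatible driver
      api_base: https://testapi.copart.com/api
      api_key: os.environ/INTERNAL_API_KEY   # read the key from the environment
```

The proxy is then started with `litellm --config litellm_config.yaml` (it listens on port 4000 by default); whether and how the agent can be pointed at it depends on the LiteLLM-related settings in server.ts.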
-
To summarize, we need at least one AI model to make the chat work. It doesn't need to be OpenAI; it can be Anthropic or something else. Conceptually, there are 3 types of models:
- hosted provider APIs such as OpenAI or Anthropic
- litellm - a proxy in front of your own (OpenAI-compatible) endpoint
- ollama - self-hosted models via ollama
When one of them is available, the agent chat is functional.
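To make the "loaded into Docker correctly" point from earlier concrete, here is a hedged docker-compose sketch; the service name, port mapping, and inline variable names are assumptions based on values mentioned in this thread.

```yaml
services:
  xata-agent:
    image: xataio/agent:0.3.1
    ports:
      - "8080:8080"               # matches the PUBLIC_URL used above
    env_file:
      - .env.production           # PUBLIC_URL and the provider API key live here
    # Alternatively, set them inline:
    # environment:
    #   PUBLIC_URL: http://10.x.x.x:8080
    #   OPENAI_API_KEY: ${OPENAI_API_KEY}
```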