Sensor returns success state on fatal executor exception #43360
-
Thanks for opening your first issue here! Be sure to follow the issue template! If you are willing to raise a PR to address this issue, please do so; there is no need to wait for approval.
-
I think this is a low-level failure, for example because there was not enough memory (apparently just starting a new process failed). When you run out of memory you cannot do much other than fail things, and Airflow will likely not be able to do anything either, because it will have no memory left to do anything with. But it's hard to say: since it is not reproducible and there is no obvious solution, the most we can do is convert it into a discussion, and maybe someone with a similar issue will add more details or a reproducible case.
-
Apache Airflow version
Other Airflow 2 version (please specify below)
If "Other Airflow 2 version" selected, which one?
2.9.3
What happened?
Execution of a sensor that pokes every 5 minutes failed within a Celery worker with the following error message:
The scheduler errors with:
As a result, the task was marked as success and downstream tasks were executed.
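For context, a minimal sketch of the kind of setup described above (the DAG id, callable, and task names are hypothetical; the actual DAG is not included in the report):

```python
# Hypothetical reconstruction of the reported setup: a sensor poking every
# 5 minutes with a downstream task gated on its success.
from datetime import datetime

from airflow import DAG
from airflow.operators.empty import EmptyOperator
from airflow.sensors.python import PythonSensor


def _condition_met() -> bool:
    # Placeholder for the real poke logic; in the report the condition was not true.
    return False


with DAG(
    dag_id="example_sensor_dag",   # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule=None,
    catchup=False,
):
    wait_for_condition = PythonSensor(
        task_id="wait_for_condition",
        python_callable=_condition_met,
        poke_interval=300,          # poke every 5 minutes, as described
        timeout=60 * 60 * 24,
        mode="poke",
    )

    downstream = EmptyOperator(task_id="downstream")

    # Downstream should only run once the sensor actually succeeds; in the
    # reported case the worker hit a fatal error mid-poke, yet the sensor task
    # was marked success and downstream ran anyway.
    wait_for_condition >> downstream
```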
What you think should happen instead?
The sensor's condition was not `true`, so downstream tasks incorrectly ran. If the execution of the task failed, I would expect Airflow to mark the task instance as failed, not success.
How to reproduce
Unfortunately, I haven't found an easy way to reproduce the issue.
Operating System
Debian GNU/Linux 12 (bookworm)
Versions of Apache Airflow Providers
No response
Deployment
Official Apache Airflow Helm Chart
Deployment details
Running on a Kubernetes cluster using the Celery executor with Redis.
Using a Postgres database for metadata.
Anything else?
No response
Are you willing to submit PR?
Code of Conduct