Apache Airflow version: 1.10.10
Environment:
Linux (RHEL), Python 3.6, Celery workers with MySQL/Redis.
What happened:
I am facing issues when trying to use mode="reschedule" with the HttpSensor.
The following works fine without issues.
If I just add mode="reschedule" to it, the task runs into a continuous retry loop.
It keeps looping indefinitely even after the lambda passed as response_check succeeds, and the following is the log I am seeing.
On the scheduler side I am seeing the following:
2020-07-31 18:40:32,287 ERROR - Executor reports task instance <TaskInstance: simplelineardagnovar185.http_sensor_check_1 2020-07-30 00:42:00+00:00 [queued]> finished (success) although the task says its queued. Was the task killed externally?
What you expected to happen:
The task should be rescheduled and succeed once the response_check condition is met.
How to reproduce it:
Use the HttpSensor with mode="reschedule" to reproduce the issue.
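For reference, a minimal sketch of such a repro is below. This is not the DAG from this report; the connection ID, endpoint, and response check are hypothetical placeholders.

```python
# Minimal repro sketch (Airflow 1.10.x imports); connection ID, endpoint, and
# response_check are hypothetical placeholders, not the original code.
from datetime import datetime

from airflow import DAG
from airflow.sensors.http_sensor import HttpSensor

with DAG(
    dag_id="http_sensor_reschedule_repro",
    start_date=datetime(2020, 7, 30),
    schedule_interval=None,
) as dag:
    HttpSensor(
        task_id="http_sensor_check_1",
        http_conn_id="my_http_conn",                     # hypothetical HTTP connection
        endpoint="status",                               # hypothetical endpoint to poll
        response_check=lambda response: "success" in response.text,  # hypothetical check
        mode="reschedule",   # with the default mode="poke" the sensor behaves normally
        poke_interval=60,
        timeout=600,
    )
```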
Anything else we need to know:
Please find below the complete DAG.
My DAG is quite simple: POST_PRODUCTS posts an async request that takes ~5 minutes to get committed to the DB, and http_sensor_check does a GET to determine whether the POST was successful.
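The original DAG code is not reproduced here; a rough sketch of the structure just described could look like the following. The dag_id and task_ids come from the scheduler log above, while the connection IDs, endpoints, payload, schedule, and response check are hypothetical placeholders.

```python
# Sketch of the two-task structure described above, NOT the original DAG:
# an async POST followed by a GET-based sensor in reschedule mode.
from datetime import datetime

from airflow import DAG
from airflow.operators.http_operator import SimpleHttpOperator
from airflow.sensors.http_sensor import HttpSensor

with DAG(
    dag_id="simplelineardagnovar185",
    start_date=datetime(2020, 7, 1),
    schedule_interval="@daily",      # placeholder schedule
    catchup=False,
) as dag:
    # Fires the asynchronous POST; the result takes ~5 minutes to commit to the DB.
    post_products = SimpleHttpOperator(
        task_id="POST_PRODUCTS",
        http_conn_id="products_api",        # hypothetical connection ID
        endpoint="products",                # hypothetical endpoint
        method="POST",
        data='{"action": "create"}',        # hypothetical payload
    )

    # Polls with GET until the POST has committed; adding mode="reschedule"
    # here is what triggers the retry loop described above.
    http_sensor_check = HttpSensor(
        task_id="http_sensor_check_1",
        http_conn_id="products_api",
        endpoint="products/status",                                    # hypothetical status endpoint
        response_check=lambda response: "committed" in response.text,  # hypothetical check
        mode="reschedule",
        poke_interval=120,
        timeout=1800,
    )

    post_products >> http_sensor_check
```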