Replies: 5 comments
- Thanks for opening your first issue here! Be sure to follow the issue template!
- Hello,
- @jnunezgts Can I assign you to this report? I am happy to help with the review.
- Hello,
- Don't worry. Be happy. We will definitely help you. We also have a special channel for new contributors: #airflow-how-to-pr
Apache Airflow version: 1.10.12, using SQLite as the backend
Kubernetes version (if you are using kubernetes) (use kubectl version): N/A. Using Docker Swarm 19.03.8
Environment:
No cloud, bare-metal server
Kernel (uname -a):
Docker info:
What happened:
Created the following DAG to schedule a one-time job:
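The DAG code itself is not present in this copy of the report. A minimal sketch of what such a DAG might look like, assuming a DockerSwarmOperator task; the dag_id, task_id, and image are taken from the log below, while the schedule and mem_limit values are illustrative assumptions:

```python
# Hypothetical reconstruction, not the reporter's exact DAG. dag_id, task_id,
# and image come from the log below; schedule_interval and mem_limit are
# illustrative assumptions.
from datetime import datetime

from airflow import DAG
from airflow.contrib.operators.docker_swarm_operator import DockerSwarmOperator

with DAG(
    dag_id='24_7_box',
    start_date=datetime(2020, 10, 17),
    schedule_interval='@once',   # run the job a single time
    catchup=False,
) as dag:
    sleep_task = DockerSwarmOperator(
        task_id='SLEEP_TASK',
        image='fedora:29',
        command='sleep 60',      # container should stay alive for 60 seconds
        auto_remove=True,
        mem_limit='128m',        # assumption: a string limit like this appears
                                 # to trigger the MemoryBytes error shown below
    )
```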
What you expected to happen:
I was expecting the container to be created and stay alive for 60 seconds, then exit with code 0. No output.
Others have reported success in the past using the Docker Swarm Operator.
Not sure. The Airflow log shows the following:
I can run this command from the Docker CLI as follows:
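The command itself did not survive the copy; an equivalent invocation, assuming the fedora:29 image and 60-second sleep from the log below (the memory limit is an illustrative value), would look roughly like:

```
docker service create --restart-condition none --limit-memory 128m fedora:29 sleep 60
```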
How to reproduce it:
Anything else we need to know:
Airflow.log
*** Reading local file: /home/user/airflow/logs/avt_24_7_box/SLEEP_TASK/2020-10-17T13:24:26.101897+00:00/2.log
[2020-10-17 09:46:58,312] {taskinstance.py:670} INFO - Dependencies all met for
[2020-10-17 09:46:58,321] {taskinstance.py:670} INFO - Dependencies all met for
[2020-10-17 09:46:58,321] {taskinstance.py:880} INFO - --------------------------------------------------------------------------------
[2020-10-17 09:46:58,321] {taskinstance.py:881} INFO - Starting attempt 2 of 2
[2020-10-17 09:46:58,321] {taskinstance.py:882} INFO - --------------------------------------------------------------------------------
[2020-10-17 09:46:58,328] {taskinstance.py:901} INFO - Executing on 2020-10-17T13:24:26.101897+00:00
[2020-10-17 09:46:58,335] {standard_task_runner.py:54} INFO - Started process 35637 to run task
[2020-10-17 09:46:58,371] {standard_task_runner.py:77} INFO - Running: ['airflow', 'run', '24_7_box', 'SLEEP_TASK', '2020-10-17T13:24:26.101897+00:00', '--job_id', '55', '--pool', 'default_pool', '--raw', '-sd', 'DAGS_FOLDER/avt_test3.py', '--cfg_path', '/tmp/tmpaivrdhuu']
[2020-10-17 09:46:58,372] {standard_task_runner.py:78} INFO - Job 55: Subtask SLEEP_TASK
[2020-10-17 09:46:58,398] {logging_mixin.py:112} INFO - Running %s on host %s server.company.com
[2020-10-17 09:46:58,467] {docker_swarm_operator.py:105} INFO - Starting docker service from image fedora:29
[2020-10-17 09:46:58,475] {taskinstance.py:1150} ERROR - 400 Client Error: Bad Request ("json: cannot unmarshal string into Go struct field Resources.MemoryBytes of type int64")
Traceback (most recent call last):
  File "/home/user/virtualenv/airflow/lib64/python3.7/site-packages/docker/api/client.py", line 259, in _raise_for_status
    response.raise_for_status()
  File "/home/user/virtualenv/airflow/lib64/python3.7/site-packages/requests/models.py", line 941, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 400 Client Error: Bad Request for url: http+docker://localhost/v1.40/services/create

During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "/home/user/virtualenv/airflow/lib64/python3.7/site-packages/airflow/models/taskinstance.py", line 984, in run_raw_task
    result = task_copy.execute(context=context)
  File "/home/user/virtualenv/airflow/lib64/python3.7/site-packages/airflow/operators/docker_operator.py", line 277, in execute
    return self.run_image()
  File "/home/user/virtualenv/airflow/lib64/python3.7/site-packages/airflow/contrib/operators/docker_swarm_operator.py", line 119, in run_image
    labels={'name': 'airflow%s_%s' % (self.dag_id, self.task_id)}
  File "/home/user/virtualenv/airflow/lib64/python3.7/site-packages/docker/utils/decorators.py", line 34, in wrapper
    return f(self, *args, **kwargs)
  File "/home/user/virtualenv/airflow/lib64/python3.7/site-packages/docker/api/service.py", line 190, in create_service
    self._post_json(url, data=data, headers=headers), True
  File "/home/user/virtualenv/airflow/lib64/python3.7/site-packages/docker/api/client.py", line 265, in _result
    self._raise_for_status(response)
  File "/home/user/virtualenv/airflow/lib64/python3.7/site-packages/docker/api/client.py", line 261, in _raise_for_status
    raise create_api_error_from_http_exception(e)
  File "/home/user/virtualenv/airflow/lib64/python3.7/site-packages/docker/errors.py", line 31, in create_api_error_from_http_exception
    raise cls(e, response=response, explanation=explanation)
docker.errors.APIError: 400 Client Error: Bad Request ("json: cannot unmarshal string into Go struct field Resources.MemoryBytes of type int64")
[2020-10-17 09:46:58,481] {taskinstance.py:1194} INFO - Marking task as FAILED. dag_id=24_7_box, task_id=SLEEP_TASK, execution_date=20201017T132426, start_date=20201017T134658, end_date=20201017T134658
[2020-10-17 09:46:58,509] {configuration.py:373} WARNING - section/key [smtp/smtp_user] not found in config
[2020-10-17 09:46:58,583] {email.py:132} INFO - Sent an alert email to ['[email protected]']
[2020-10-17 09:47:03,312] {local_task_job.py:102} INFO - Task exited with return code 1
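For what it's worth, the 400 error above ("json: cannot unmarshal string into Go struct field Resources.MemoryBytes of type int64") reads as the Swarm API receiving a human-readable memory string where it expects an integer byte count. A possible workaround, sketched here as an assumption rather than a verified fix for 1.10.12, is to pass mem_limit as an int (bytes):

```python
# Assumption, not a verified fix: give mem_limit as an integer byte count so
# the service spec serializes MemoryBytes as an int64 instead of a string.
# (Placed inside the same DAG context as the sketch above.)
from airflow.contrib.operators.docker_swarm_operator import DockerSwarmOperator

sleep_task = DockerSwarmOperator(
    task_id='SLEEP_TASK',
    image='fedora:29',
    command='sleep 60',
    auto_remove=True,
    mem_limit=128 * 1024 * 1024,  # 128 MiB as an int, rather than '128m'
)
```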