DAGs / Tasks fail to run when corporate SSL cert is enabled for API server #55147

@emredjan

Description

Apache Airflow version

3.0.6

If "Other Airflow 2 version" selected, which one?

No response

What happened?

We're using intermediate certificates in our corporate environment, signed by our corporate CA. We were successfully using them with Airflow 2.x, with the web server configured via the AIRFLOW__WEBSERVER__WEB_SERVER_SSL_CERT and AIRFLOW__WEBSERVER__WEB_SERVER_SSL_KEY options. When we migrated to Airflow 3, we changed the configuration items to AIRFLOW__API__SSL_CERT and AIRFLOW__API__SSL_KEY, and set AIRFLOW__API__BASE_URL to https://<domain>:<port>. With this configuration, none of the DAGs/tasks run; they either get stuck as queued or fail with errors:

ERROR - DAG 'test_executor' for task instance <TaskInstance: test_executor.do_something manual__2025-09-01T14:35:32.346203+00:00 [queued]> not found in serialized_dag table
ERROR - Executor CeleryExecutor(parallelism=16) reported that the task instance <TaskInstance: test_executor.do_something manual__2025-09-01T14:35:32.346203+00:00 [queued]> finished with state failed, but the task instance's state attribute is queued. Learn more: https://airflow.apache.org/docs/apache-airflow/stable/troubleshooting.html#task-state-changed-externally

Switching to LocalExecutor has the same result.

When removing AIRFLOW__API__SSL_CERT and AIRFLOW__API__SSL_KEY and setting AIRFLOW__API__BASE_URL to http://<domain>:<port>, everything works fine (albeit without SSL):

INFO - Marking run <DagRun test_executor @ 2025-09-01 14:38:09.656000+00:00: manual__2025-09-01T14:38:10.519908+00:00, state:running, queued_at: 2025-09-01 14:38:10.525105+00:00. run_type: manual> successful
INFO - DagRun Finished: dag_id=test_executor, logical_date=2025-09-01 14:38:09.656000+00:00, run_id=manual__2025-09-01T14:38:10.519908+00:00, run_start_date=2025-09-01 14:38:10.699026+00:00, run_end_date=2025-09-01 14:38:14.555043+00:00, run_duration=3.856017, state=success, run_type=manual, data_interval_start=2025-09-01 14:38:09.656000+00:00, data_interval_end=2025-09-01 14:38:09.656000+00:00,

The same certificate and configuration also run totally fine with SSL enabled on 2.x. I'm assuming this is because of the change to a generic API server in v3, which the workers also use to communicate, and somehow the SSL configuration breaks it. This part of the documentation could be a solution: https://airflow.apache.org/docs/apache-airflow/stable/howto/run-with-self-signed-certificate.html#generating-the-certificate but, as noted there, it is not very secure for production use. Disabling SSL also removes SSL from the web UI, which is not acceptable either.
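For context, the workaround in the linked how-to amounts to generating a self-signed certificate whose subject alternative names cover whatever hostname the workers use to reach the API server. A rough sketch (airflow.example.com and the file names are placeholders, not values from this report):

```shell
# Sketch only: self-signed cert with SANs covering the API server hostname,
# localhost, and the loopback IP, so worker-side hostname verification passes.
# Substitute your real hostname; key/cert paths are illustrative.
openssl req -x509 -newkey rsa:4096 -nodes \
  -keyout apiserver.key -out apiserver.crt \
  -days 365 -subj "/CN=airflow.example.com" \
  -addext "subjectAltName=DNS:airflow.example.com,DNS:localhost,IP:127.0.0.1"
```

This is exactly the part that is hard to reconcile with a corporate CA: the corporate-signed certificate's SANs are fixed, and adding every internal hostname to them is usually not an option.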

What you think should happen instead?

Airflow should simply use the supplied base URL for all API-worker communication and shouldn't rely on subject alternative names being present in the certificates. It should just work with AIRFLOW__API__SSL_CERT and AIRFLOW__API__SSL_KEY set to corporate certificates.

How to reproduce

Point AIRFLOW__API__SSL_CERT and AIRFLOW__API__SSL_KEY to your corporate cert, set AIRFLOW__API__BASE_URL to https://<domain>:<port>, and run a simple DAG, such as:

# Airflow 3: the dag/task decorators come from the Task SDK
from airflow.sdk import dag, task


@dag
def test_executor():
    @task
    def do_something():
        print("Doing something...")
        return "Done"

    do_something()


test_executor()
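The failing environment configuration described above might look like this (hostname, port, and file paths are placeholders, not the actual values from this deployment):

```shell
# Illustrative values only; substitute your real hostname, port, and cert paths.
export AIRFLOW__API__BASE_URL="https://airflow.example.com:8080"
export AIRFLOW__API__SSL_CERT="/etc/ssl/certs/airflow.crt"   # corporate cert chain
export AIRFLOW__API__SSL_KEY="/etc/ssl/private/airflow.key"  # matching private key
```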

Operating System

RHEL 8.10

Versions of Apache Airflow Providers

apache-airflow==3.0.6
apache-airflow-core==3.0.6
apache-airflow-providers-celery==3.12.2
apache-airflow-providers-common-compat==1.7.3
apache-airflow-providers-common-io==1.6.2
apache-airflow-providers-common-sql==1.27.5
apache-airflow-providers-fab==2.4.1
apache-airflow-providers-ftp==3.13.2
apache-airflow-providers-git==0.0.6
apache-airflow-providers-hashicorp==4.3.2
apache-airflow-providers-http==5.3.3
apache-airflow-providers-imap==3.9.2
apache-airflow-providers-microsoft-azure==12.6.1
apache-airflow-providers-microsoft-mssql==4.3.2
apache-airflow-providers-microsoft-psrp==3.1.2
apache-airflow-providers-mysql==6.3.3
apache-airflow-providers-odbc==4.10.2
apache-airflow-providers-sftp==5.3.4
apache-airflow-providers-smtp==2.2.0
apache-airflow-providers-sqlite==4.1.2
apache-airflow-providers-ssh==4.1.3
apache-airflow-providers-standard==1.6.0
apache-airflow-task-sdk==1.0.6

Deployment

Virtualenv installation

Deployment details

Python 3.12, virtualenv, official pip installation with constraints.

Anything else?

No response

Are you willing to submit PR?

  • Yes I am willing to submit a PR!

Code of Conduct

Metadata

Assignees

No one assigned

    Labels

    area:API, area:core, kind:bug, needs-triage

    Type

    No type

    Projects

    No projects

    Milestone

    No milestone

    Relationships

    None yet

    Development

    No branches or pull requests
