feat: add airflow operator and hook for ClickHouse #3699

Open · wants to merge 7 commits into `main`

20 changes: 20 additions & 0 deletions docs/integrations/engines/clickhouse.md
@@ -394,6 +394,26 @@ If a model has many records in each partition, you may see additional performanc
## Local/Built-in Scheduler
**Engine Adapter Type**: `clickhouse`

## Airflow Scheduler
**Engine Name:** `clickhouse`

To share a common implementation across the local and Airflow schedulers, SQLMesh provides its own ClickHouse hook and operator.

By default, the connection ID is set to `sqlmesh_clickhouse_default`, but it can be overridden using the `engine_operator_args` parameter of the `SQLMeshAirflow` instance, as in the example below:
```python linenums="1"
from sqlmesh.schedulers.airflow import NO_DEFAULT_CATALOG, SQLMeshAirflow

sqlmesh_airflow = SQLMeshAirflow(
"clickhouse",
default_catalog=NO_DEFAULT_CATALOG,
engine_operator_args={
"sqlmesh_clickhouse_conn_id": "<Connection ID>"
},
)
```

Note: `NO_DEFAULT_CATALOG` is required because ClickHouse does not support catalogs.
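
For reference, the connection itself can be registered ahead of time. Below is a minimal sketch of doing so programmatically (one of several options; the host, port, and credentials are placeholders, and the field-to-argument mapping mirrors the hook's `get_conn`):
```python linenums="1"
from airflow.models import Connection
from airflow.settings import Session

# Placeholder values; ClickHouse's HTTP interface typically listens on 8123.
conn = Connection(
    conn_id="sqlmesh_clickhouse_default",
    conn_type="sqlmesh_clickhouse",
    host="localhost",
    port=8123,
    login="default",
    password="",
    schema="default",  # mapped to the ClickHouse database by the hook
)

session = Session()
session.add(conn)
session.commit()
```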

### Connection options

| Option | Description | Type | Required |
34 changes: 34 additions & 0 deletions sqlmesh/schedulers/airflow/hooks/clickhouse.py
@@ -0,0 +1,34 @@
from __future__ import annotations

import typing as t

from airflow.providers.common.sql.hooks.sql import DbApiHook

if t.TYPE_CHECKING:
from clickhouse_connect.dbapi.connection import Connection


class SQLMeshClickHouseHook(DbApiHook):
"""
Uses the ClickHouse Python DB API connector.
"""

conn_name_attr = "sqlmesh_clickhouse_conn_id"
default_conn_name = "sqlmesh_clickhouse_default"
conn_type = "sqlmesh_clickhouse"
hook_name = "SQLMesh ClickHouse"

def get_conn(self) -> Connection:
"""Returns a ClickHouse connection object"""
from clickhouse_connect.dbapi import connect

db = self.get_connection(getattr(self, t.cast(str, self.conn_name_attr)))

return connect(
host=db.host,
port=db.port,
username=db.login,
password=db.password,
database=db.schema,
**db.extra_dejson,
)
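
A minimal sketch of exercising the hook outside of a DAG, assuming the default connection is registered and a ClickHouse server is reachable (`SELECT version()` is just an arbitrary probe query):

```python
from sqlmesh.schedulers.airflow.hooks.clickhouse import SQLMeshClickHouseHook

# Falls back to the `sqlmesh_clickhouse_default` connection ID.
hook = SQLMeshClickHouseHook()
conn = hook.get_conn()

cursor = conn.cursor()
cursor.execute("SELECT version()")
print(cursor.fetchone())
```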
32 changes: 32 additions & 0 deletions sqlmesh/schedulers/airflow/operators/clickhouse.py
@@ -0,0 +1,32 @@
from __future__ import annotations

import typing as t

from sqlmesh.schedulers.airflow.hooks.clickhouse import SQLMeshClickHouseHook
from sqlmesh.schedulers.airflow.operators.base import BaseDbApiOperator
from sqlmesh.schedulers.airflow.operators.targets import BaseTarget


class SQLMeshClickHouseOperator(BaseDbApiOperator):
"""The operator that evaluates a SQLMesh model snapshot on a ClickHouse target

Args:
target: The target that will be executed by this operator instance.
        clickhouse_conn_id: The Airflow connection ID for the ClickHouse target.
"""

def __init__(
self,
*,
target: BaseTarget,
clickhouse_conn_id: str = SQLMeshClickHouseHook.default_conn_name,
**kwargs: t.Any,
) -> None:
super().__init__(
target=target,
conn_id=clickhouse_conn_id,
dialect="clickhouse",
hook_type=SQLMeshClickHouseHook,
**kwargs,
)
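
The `engine_operator_args` passed to `SQLMeshAirflow` are forwarded to this constructor, which is why the docs example above overrides `clickhouse_conn_id`. A sketch of direct instantiation (`snapshot_target` is a placeholder; in practice SQLMesh constructs the target for each model snapshot):

```python
from sqlmesh.schedulers.airflow.operators.clickhouse import SQLMeshClickHouseOperator

# `snapshot_target` stands in for the BaseTarget that SQLMesh builds internally.
operator = SQLMeshClickHouseOperator(
    task_id="sqlmesh_evaluate_snapshot",
    target=snapshot_target,
    clickhouse_conn_id="my_clickhouse_conn",
)
```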
4 changes: 4 additions & 0 deletions sqlmesh/schedulers/airflow/util.py
@@ -122,6 +122,10 @@ def discover_engine_operator(name: str, sql_only: bool = False) -> t.Type[BaseOp
name = name.lower()

try:
if name == "clickhouse":
from sqlmesh.schedulers.airflow.operators.clickhouse import SQLMeshClickHouseOperator

return SQLMeshClickHouseOperator
if name == "spark":
from sqlmesh.schedulers.airflow.operators.spark_submit import (
SQLMeshSparkSubmitOperator,
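
With this branch in place, the operator should resolve by engine name; a quick sketch:

```python
from sqlmesh.schedulers.airflow.util import discover_engine_operator

operator_cls = discover_engine_operator("clickhouse")
assert operator_cls.__name__ == "SQLMeshClickHouseOperator"
```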