
Commit

Merge pull request #33 from fivetran/bugfix/user_unnest_warehouses
bugfix/user_unnest_warehouses
fivetran-catfritz authored Jul 10, 2023
2 parents 0c18eda + 53d450a commit 0aeb17e
Showing 9 changed files with 27 additions and 15 deletions.
6 changes: 6 additions & 0 deletions CHANGELOG.md
@@ -1,3 +1,9 @@
# dbt_iterable v0.9.0
[PR #33](https://github.com/fivetran/dbt_iterable/pull/33) includes the following update:
## 🚨 Breaking Changes 🚨 (recommend `--full-refresh` for Bigquery and Snowflake users)
- Updated the intermediate model `int_iterable__list_user_unnest` to ensure rows with empty arrays are not removed on any warehouse.
- **Bigquery and Snowflake users**: this affects downstream models `iterable__users` and `iterable__list_user_history`. We recommend using `dbt run --full-refresh` the next time you run your project.

# dbt_iterable v0.8.0
[PR #30](https://github.com/fivetran/dbt_iterable/pull/30) includes the following updates:
## 🚨 Breaking Changes 🚨 (recommend `--full-refresh`)
2 changes: 1 addition & 1 deletion README.md
@@ -56,7 +56,7 @@ Include the following Iterable package version in your `packages.yml` file.
```yaml
packages:
- package: fivetran/iterable
-    version: [">=0.8.0", "<0.9.0"]
+    version: [">=0.9.0", "<0.10.0"]
```
## Step 3: Define database and schema variables
By default, this package runs using your destination and the `iterable` schema of your [target database](https://docs.getdbt.com/docs/running-a-dbt-project/using-the-command-line-interface/configure-your-profile). If this is not where your Iterable data is located (for example, if your Iterable schema is named `iterable_fivetran`), add the following configuration to your root `dbt_project.yml` file:
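The configuration the README refers to typically looks like the following. This is an illustrative sketch following the usual Fivetran source-package variable convention (`iterable_database` / `iterable_schema`); confirm the exact variable names against the full README.

```yaml
# root dbt_project.yml (illustrative; variable names assumed from Fivetran convention)
vars:
    iterable_database: your_destination_name
    iterable_schema: iterable_fivetran
```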
2 changes: 1 addition & 1 deletion dbt_project.yml
@@ -1,5 +1,5 @@
name: 'iterable'
-version: '0.8.0'
+version: '0.9.0'
config-version: 2
require-dbt-version: [">=1.3.0", "<2.0.0"]
models:
2 changes: 1 addition & 1 deletion docs/catalog.json

Large diffs are not rendered by default.

2 changes: 1 addition & 1 deletion docs/manifest.json

Large diffs are not rendered by default.

2 changes: 1 addition & 1 deletion docs/run_results.json

Large diffs are not rendered by default.

10 changes: 5 additions & 5 deletions integration_tests/ci/sample.profiles.yml
@@ -16,13 +16,13 @@ integration_tests:
pass: "{{ env_var('CI_REDSHIFT_DBT_PASS') }}"
dbname: "{{ env_var('CI_REDSHIFT_DBT_DBNAME') }}"
port: 5439
-schema: iterable_integration_tests_03
+schema: iterable_integration_tests_04
threads: 8
bigquery:
type: bigquery
method: service-account-json
project: 'dbt-package-testing'
-schema: iterable_integration_tests_03
+schema: iterable_integration_tests_04
threads: 8
keyfile_json: "{{ env_var('GCLOUD_SERVICE_KEY') | as_native }}"
snowflake:
@@ -33,7 +33,7 @@ integration_tests:
role: "{{ env_var('CI_SNOWFLAKE_DBT_ROLE') }}"
database: "{{ env_var('CI_SNOWFLAKE_DBT_DATABASE') }}"
warehouse: "{{ env_var('CI_SNOWFLAKE_DBT_WAREHOUSE') }}"
-schema: iterable_integration_tests_03
+schema: iterable_integration_tests_04
threads: 8
postgres:
type: postgres
@@ -42,13 +42,13 @@ integration_tests:
pass: "{{ env_var('CI_POSTGRES_DBT_PASS') }}"
dbname: "{{ env_var('CI_POSTGRES_DBT_DBNAME') }}"
port: 5432
-schema: iterable_integration_tests_03
+schema: iterable_integration_tests_04
threads: 8
databricks:
catalog: "{{ env_var('CI_DATABRICKS_DBT_CATALOG') }}"
host: "{{ env_var('CI_DATABRICKS_DBT_HOST') }}"
http_path: "{{ env_var('CI_DATABRICKS_DBT_HTTP_PATH') }}"
-schema: iterable_integration_tests_03
+schema: iterable_integration_tests_04
threads: 8
token: "{{ env_var('CI_DATABRICKS_DBT_TOKEN') }}"
type: databricks
4 changes: 2 additions & 2 deletions integration_tests/dbt_project.yml
@@ -1,10 +1,10 @@
config-version: 2
name: 'iterable_integration_tests'
-version: '0.8.0'
+version: '0.9.0'
profile: 'integration_tests'
vars:
iterable_source:
-iterable_schema: iterable_integration_tests_03
+iterable_schema: iterable_integration_tests_04
iterable_campaign_history_identifier: "campaign_history_data"
iterable_campaign_label_history_identifier: "campaign_label_history_data"
iterable_campaign_list_history_identifier: "campaign_list_history_data"
12 changes: 9 additions & 3 deletions models/intermediate/int_iterable__list_user_unnest.sql
@@ -83,13 +83,19 @@ with user_history as (

{% if target.type == 'snowflake' %}
cross join
-        table(flatten(input => parse_json(email_list_ids))) as email_list_id
+        table(flatten(input => parse_json(
+            case when email_list_ids = '[]' then '["is_null"]' {# to not remove empty array-rows #}
+            else email_list_ids end))) as email_list_id
{% elif target.type == 'bigquery' %}
cross join
-        unnest(JSON_EXTRACT_STRING_ARRAY(email_list_ids)) as email_list_id
+        unnest(JSON_EXTRACT_STRING_ARRAY(
+            case when email_list_ids = '[]' then '["is_null"]' {# to not remove empty array-rows #}
+            else email_list_ids end)) as email_list_id
{% elif target.type in ('spark','databricks') %}
cross join
-        lateral explode_outer(from_json(email_list_ids, 'array<int>')) as email_list_id
+        lateral explode_outer(from_json(
+            case when email_list_ids = '[]' then '["is_null"]' {# to not remove empty array-rows #}
+            else email_list_ids end, 'array<int>')) as email_list_id
{% else %} {# target is postgres #}
cross join
json_array_elements_text(cast((
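The bug this hunk fixes is a general property of cross-join unnesting: flattening an empty array yields zero rows, so the parent row vanishes from the output. The fix substitutes a one-element sentinel array (`["is_null"]`) before unnesting. The following Python sketch is purely illustrative (the data and function names are invented, not part of the package) and simulates both behaviors:

```python
import json

# Illustrative rows mimicking the user_history relation in the model.
users = [
    {"email": "a@example.com", "email_list_ids": "[101, 102]"},
    {"email": "b@example.com", "email_list_ids": "[]"},  # empty array
]

def unnest_naive(rows):
    # Mirrors the old `cross join flatten(...)`: a row whose array is
    # empty contributes zero output rows and silently disappears.
    out = []
    for row in rows:
        for list_id in json.loads(row["email_list_ids"]):
            out.append({"email": row["email"], "email_list_id": list_id})
    return out

def unnest_with_sentinel(rows):
    # Mirrors the fix: replace '[]' with '["is_null"]' before unnesting,
    # so empty-array rows survive as a single placeholder row.
    out = []
    for row in rows:
        raw = row["email_list_ids"]
        ids = json.loads('["is_null"]' if raw == "[]" else raw)
        for list_id in ids:
            out.append({"email": row["email"], "email_list_id": list_id})
    return out

print(len(unnest_naive(users)))          # 2 -- b@example.com is dropped
print(len(unnest_with_sentinel(users)))  # 3 -- b@example.com kept via sentinel
```

Downstream models can then treat the `is_null` sentinel as "user belongs to no list" instead of losing the user entirely, which is why BigQuery and Snowflake users are advised to run `dbt run --full-refresh` after upgrading.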
