[Bug] __tmp_not_partitioned CTAS inherited model config "external_location" and causing clashing with the model data table #683
Comments
@u-ra-ra-ra feel free to propose a PR and we will be happy to review it.
I will hopefully have some time next week to give it a shot.
@lucastrubiano thanks for spotting this issue. Would you mind raising a PR with your fix? Thanks
@nicor88 Glad to push the changes, but I don't have permissions to create a branch or PR. Any guidance on how to proceed?
@lucastrubiano |
Is this a new bug in dbt-athena?
Current Behavior
I have a model that writes out to more than 100 hive partitions.
I recently dropped the table and attempted to recreate it via dbt build. I have an external_location in the configuration because I migrated onto dbt and want to avoid rebuilding data / moving the s3 location.
The unexpected behavior is that dbt build succeeded but my s3 path does not contain any data.
dbt / adapter version:
The CTAS generated:
Expected Behavior
The XX__tmp_not_partitioned table should not use the configured external_location, which clashes with the actual model's path. Instead it should use either a unique tmp location or the same scheme as the __dbt_tmp table, which goes to s3_data_dir/schema/table__dbt_tmp.
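One possible shape of the fix, sketched in Jinja below. This is only a sketch of the idea, not the adapter's actual internals: the variable names (`tmp_location`, `s3_data_dir`) and the exact way the relation's schema and identifier are accessed are assumptions for illustration.

```sql
{#
  Sketch: derive the not-partitioned tmp table's location from the
  data dir, mirroring the __dbt_tmp naming scheme, instead of
  inheriting the model's external_location.
  `s3_data_dir` is assumed to be available from the target config.
#}
{%- set tmp_location = target.s3_data_dir
        ~ relation.schema ~ '/'
        ~ relation.identifier ~ '__tmp_not_partitioned' -%}
```

With something like this, the intermediate CTAS would land in a path derived from s3_data_dir rather than colliding with the model's configured external_location.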
Steps To Reproduce
1. Create a partitioned model with more than 100 hive partitions, with an external_location in config.
2. If this table exists in glue, drop it.
3. Run dbt run to create more than 100 partitions in one sql run.
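A minimal model that matches these steps might look like the following. The model name, source, partition column, and bucket path are all hypothetical; the config keys (`partitioned_by`, `external_location`) are the dbt-athena ones.

```sql
-- models/my_partitioned_model.sql (hypothetical example)
{{
    config(
        materialized='table',
        format='parquet',
        partitioned_by=['event_date'],
        external_location='s3://my-bucket/my-prefix/my_partitioned_model/'
    )
}}

-- Selecting more than 100 distinct event_date values makes the
-- adapter split the CTAS into a not-partitioned temp step, which
-- is where the external_location clash occurs.
select *
from {{ source('raw', 'events') }}
```

Dropping the corresponding Glue table and then running `dbt run` on this model should reproduce the empty-path behavior described above.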
Environment
Additional Context
I am happy to contribute to get this fixed if that helps speed things up.
https://github.com/dbt-athena/dbt-athena/blob/main/dbt/include/athena/macros/materializations/models/table/create_table_as.sql