Option to set custom dtype values for models using postgres adapter #768
Labels: feature request
Context

I often work with PostgreSQL `jsonb` columns in my dbt models. With `dbt-fal` and a base `postgres` profile, there isn't a way to pass `dtype` options to SQLAlchemy, making it impossible to generate `jsonb` and similar column types in the resulting table.
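For reference, plain pandas/SQLAlchemy already expose this through the `dtype` argument of `DataFrame.to_sql`; a minimal sketch of the mechanism (the connection string, table, and column names are placeholders):

```python
import pandas as pd
from sqlalchemy import create_engine
from sqlalchemy.dialects.postgresql import JSONB

# Placeholder connection string.
engine = create_engine("postgresql://user:pass@localhost:5432/mydb")

df = pd.DataFrame({
    "id": [1, 2],
    "payload": [{"a": 1}, {"b": 2}],
})

# The dtype mapping is what dbt-fal currently gives no way to set:
# it tells SQLAlchemy to create "payload" as jsonb instead of text.
df.to_sql("my_table", engine, if_exists="replace", index=False,
          dtype={"payload": JSONB})
```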
Is your feature request related to a problem? Please describe.

In my test Python model, one of the dataframe columns contains JSON-serialized data. The resulting table in PostgreSQL has this column as a `text` type, but what I want is for the database to store that data as a `jsonb` type. I need a way to pass the desired type through to the `dataframe.to_sql` call that `dbt-fal` performs.
Describe the solution you'd like

Perhaps a config option? `postgres.write_df_to_relation` would then pass it through to `data.to_sql`.
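Something along these lines, where `sql_dtypes` is a name I've invented purely for this sketch; `write_df_to_relation` would translate the mapping into SQLAlchemy types before calling `data.to_sql`:

```python
import json

import pandas as pd


def model(dbt, session):
    # Hypothetical: "sql_dtypes" is not a real config key today; it is
    # only a sketch of the kind of option I have in mind.
    dbt.config(
        materialized="table",
        sql_dtypes={"payload": "jsonb"},
    )
    return pd.DataFrame({"payload": [json.dumps({"a": 1})]})
```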
Describe alternatives you've considered

I tried using a dbt `post-hook` config on the model to alter the column type, but my `post-hook` statements never get called. I'm not sure why.
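Roughly what I tried (the column name is made up, and I'm not certain this is the exact spelling `dbt-fal` expects for hooks on Python models):

```python
import json

import pandas as pd


def model(dbt, session):
    dbt.config(
        materialized="table",
        # Intended to cast the text column in place after the table is
        # written; these statements never appear to run.
        post_hook=[
            "ALTER TABLE {{ this }} "
            "ALTER COLUMN payload TYPE jsonb USING payload::jsonb"
        ],
    )
    return pd.DataFrame({"payload": [json.dumps({"a": 1})]})
```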
Additional context

This is different from specifying dtypes on the dataframe columns themselves; the data is correctly treated as object/string in the dataframe. It's when the dataframe is materialized to SQL that the `jsonb` type should apply.
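In other words:

```python
import json

import pandas as pd

df = pd.DataFrame({"payload": [json.dumps({"a": 1})]})

# The pandas-side dtype is already correct (object), so nothing on the
# dataframe needs to change; only the SQL column type created during
# to_sql does.
print(df["payload"].dtype)  # object
```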