[SPARK-50700][SQL] spark.sql.catalog.spark_catalog supports builtin magic value #49332
base: master
Conversation
@cloud-fan @yaooqinn @huaxingao please let me know if this approach makes sense, or do you have any other suggestions to allow users to set spark.sql.catalog.spark_catalog back to the built-in session catalog.
Makes sense to me.
.version("3.0.0")
.stringConf
- .createOptional
+ .createWithDefault("builtin")
shall we normalize it to lower case? e.g. .transform(_.toLowerCase(Locale.ROOT))
thanks for the suggestion, updated
sorry, I was wrong... It's a class name, so we can't lowercase it.
Maybe the string comparisons should be case-insensitive instead.
oh, right, let me change it
@cloud-fan how about now?
makes sense to me
kindly ping @cloud-fan, can we get this in?
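Putting the review thread together, here is a minimal sketch of what the resulting config definition and the case-insensitive check could look like, assuming SQLConf's buildConf DSL; the doc text and helper name are illustrative, not the PR's exact diff:

```scala
// Sketch inside SQLConf: the value is a class name, so it is kept as-is
// (not lowercased); only the default changes from None to "builtin".
val V2_SESSION_CATALOG_IMPLEMENTATION =
  buildConf("spark.sql.catalog.spark_catalog")
    .doc("A catalog implementation used as spark_catalog, or the magic value " +
      "'builtin' for the built-in V2SessionCatalog.") // doc wording is illustrative
    .version("3.0.0")
    .stringConf
    .createWithDefault("builtin")

// At resolution time, compare against the magic value case-insensitively,
// as agreed in the thread above.
def usesBuiltinSessionCatalog(impl: String): Boolean =
  impl.equalsIgnoreCase("builtin")
```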
What changes were proposed in this pull request?
This PR adds a magic value builtin (and sets it as the default value) for spark.sql.catalog.spark_catalog.
Why are the changes needed?
Currently, spark.sql.catalog.spark_catalog is optional and has None as the default value. When spark.sql.catalog.spark_catalog=a.bad.catalog.impl is wrongly set in spark-defaults.conf, the user has no way to override it in spark-submit.
Note that explicitly setting it to o.a.s.sql.execution.datasources.v2.V2SessionCatalog does not work either, because V2SessionCatalog does not have a zero-args constructor.
To fix the above issue, similar to what we did for spark.sql.hive.metastore.jars, we just use "builtin" to represent the built-in V2SessionCatalog.
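For illustration, a minimal sketch (not taken from the PR) of how a user could now override a bad value coming from spark-defaults.conf, either via --conf spark.sql.catalog.spark_catalog=builtin on spark-submit or programmatically; the master setting below is arbitrary:

```scala
import org.apache.spark.sql.SparkSession

// Overrides a wrong spark.sql.catalog.spark_catalog from spark-defaults.conf;
// "builtin" maps back to the built-in V2SessionCatalog, no class name needed.
val spark = SparkSession.builder()
  .master("local[1]") // illustrative local master
  .config("spark.sql.catalog.spark_catalog", "builtin")
  .getOrCreate()
```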
Does this PR introduce any user-facing change?
No change to the default behavior, and users are now allowed to use spark.sql.catalog.spark_catalog=builtin to set spark_catalog to the built-in V2SessionCatalog.
How was this patch tested?
Code in existing UTs is updated accordingly.
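A rough, hypothetical sketch of the kind of test update this implies — after the change, reading the config without setting it yields the builtin default rather than None; the snippet below is illustrative, not the PR's actual test code:

```scala
import org.apache.spark.sql.SparkSession

// Illustrative check, not the PR's actual UT: with createWithDefault("builtin"),
// reading the config without setting it returns the magic value.
val spark = SparkSession.builder().master("local[1]").getOrCreate()
assert(spark.conf.get("spark.sql.catalog.spark_catalog") == "builtin")
spark.stop()
```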
Was this patch authored or co-authored using generative AI tooling?
No.