Cannot add column, name already exists #324

Open
siddiquebagwan opened this issue Jan 21, 2025 · 0 comments

I am reading data from a Kafka topic and the connector fails with the following error:

Caused by: java.lang.IllegalArgumentException: Cannot add column, name already exists: A.b.c
at org.apache.iceberg.relocated.com.google.common.base.Preconditions.checkArgument(Preconditions.java:445)
at org.apache.iceberg.SchemaUpdate.internalAddColumn(SchemaUpdate.java:156)
at org.apache.iceberg.SchemaUpdate.addColumn(SchemaUpdate.java:107)
at org.apache.iceberg.UpdateSchema.addColumn(UpdateSchema.java:107)
at io.tabular.iceberg.connect.data.SchemaUtils.lambda$commitSchemaUpdates$4(SchemaUtils.java:127)
at java.base/java.util.ArrayList.forEach(ArrayList.java:1541)
at io.tabular.iceberg.connect.data.SchemaUtils.commitSchemaUpdates(SchemaUtils.java:126)
at io.tabular.iceberg.connect.data.SchemaUtils.lambda$applySchemaUpdates$0(SchemaUtils.java:93)
at org.apache.iceberg.util.Tasks$Builder.runTaskWithRetry(Tasks.java:413)
at org.apache.iceberg.util.Tasks$Builder.runSingleThreaded(Tasks.java:219)
at org.apache.iceberg.util.Tasks$Builder.run(Tasks.java:203)
at org.apache.iceberg.util.Tasks$Builder.run(Tasks.java:196)
at io.tabular.iceberg.connect.data.SchemaUtils.applySchemaUpdates(SchemaUtils.java:93)
at io.tabular.iceberg.connect.data.IcebergWriter.convertToRow(IcebergWriter.java:98)
at io.tabular.iceberg.connect.data.IcebergWriter.write(IcebergWriter.java:68)
... 20 more
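
For context, the column path in the error implies the topic carries nested JSON; a record shaped roughly like the following would map to the A.b.c column (hypothetical field values, real names anonymized):

{"A": {"b": {"c": "example-value"}}}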

This is my sink properties file:

name=iceberg-sink-connector
topics=sample-topic
connector.class=io.tabular.iceberg.connect.IcebergSinkConnector
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=false
key.converter.schemas.enable=false
iceberg.tables.evolve-schema-enabled=true
iceberg.catalog=fake-bucket
iceberg.catalog.type=hadoop
iceberg.table.name=bronze.simple1
iceberg.catalog.warehouse=
iceberg.tables.auto-create-enabled=true
iceberg.tables.schema-evolution.enabled=true
iceberg.catalog.s3.path-style-access=false
iceberg.tables.dynamic-enabled=false
iceberg.tables.schema-force-optional=false
iceberg.tables.schema-case-insensitive=false
iceberg.catalog.io-impl=org.apache.iceberg.aws.s3.S3FileIO
commit.timeout.ms=120000
iceberg.commit.timeout-ms=60000
iceberg.tables=bronze.simple1
log4j.logger.io.confluent.connect.s3=DEBUG
iceberg.control.commit.timeout-ms=200000
iceberg.control.commit.interval-ms=200
iceberg.connection.timeout.ms=300000
iceberg.read.timeout.ms=300000

I tried the configuration below as well:

iceberg.tables.write-props.write.operation=append
consumer.write.operation=append
write.operation=merge
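
For what it's worth, the same exception can be reproduced directly against the Iceberg Java API, outside of Kafka Connect, by adding a column whose full path already exists in the table schema. A minimal sketch (the Table handle and class name are hypothetical; this is not the connector's actual code path):

import org.apache.iceberg.Table;
import org.apache.iceberg.types.Types;
import org.apache.iceberg.types.Types.NestedField;

public class DuplicateColumnRepro {

    // "table" is assumed to be loaded from any Iceberg catalog.
    static void repro(Table table) {
        // First update adds a struct column A containing b.c; this succeeds.
        // (The field IDs passed here are placeholders; Iceberg reassigns them on commit.)
        table.updateSchema()
                .addColumn("A", Types.StructType.of(
                        NestedField.optional(1, "b", Types.StructType.of(
                                NestedField.optional(2, "c", Types.StringType.get())))))
                .commit();

        // Adding c under parent A.b again throws:
        // java.lang.IllegalArgumentException: Cannot add column, name already exists: A.b.c
        table.updateSchema()
                .addColumn("A.b", "c", Types.StringType.get())
                .commit();
    }
}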
