I want to join a Kafka stream with a KTable. The POC works fine with plain stream data. However, when I use CloudEvent as the value type, I keep running into serialization issues.
```
18:12:08.691 [basic-streams-updated-0630c691-0080-4e02-8c85-7bff650f34e9-StreamThread-1] ERROR org.apache.kafka.streams.KafkaStreams - stream-client [basic-streams-updated-0630c691-0080-4e02-8c85-7bff650f34e9] Encountered the following exception during processing and the registered exception handler opted to SHUTDOWN_CLIENT. The streams client is going to shut down now.
org.apache.kafka.streams.errors.StreamsException: Exception caught in process. taskId=0_0, processor=KSTREAM-SOURCE-0000000002, topic=cloudevent-ktable, partition=0, offset=80, stacktrace=java.lang.UnsupportedOperationException: CloudEventSerializer supports only the signature serialize(String, Headers, CloudEvent)
Caused by: java.lang.UnsupportedOperationException: CloudEventSerializer supports only the signature serialize(String, Headers, CloudEvent)
	at io.cloudevents.kafka.CloudEventSerializer.serialize(CloudEventSerializer.java:84) ~[cloudevents-kafka-2.5.0.jar:?]
	at io.cloudevents.kafka.CloudEventSerializer.serialize(CloudEventSerializer.java:38) ~[cloudevents-kafka-2.5.0.jar:?]
	at org.apache.kafka.streams.state.internals.ValueAndTimestampSerializer.serialize(ValueAndTimestampSerializer.java:82) ~[kafka-streams-2.8.0.jar:?]
	at org.apache.kafka.streams.state.internals.ValueAndTimestampSerializer.serialize(ValueAndTimestampSerializer.java:73) ~[kafka-streams-2.8.0.jar:?]
	at org.apache.kafka.streams.state.internals.ValueAndTimestampSerializer.serialize(ValueAndTimestampSerializer.java:30) ~[kafka-streams-2.8.0.jar:?]
	at org.apache.kafka.streams.state.StateSerdes.rawValue(StateSerdes.java:192) ~[kafka-streams-2.8.0.jar:?]
	at org.apache.kafka.streams.state.internals.MeteredKeyValueStore.lambda$put$4(MeteredKeyValueStore.java:200) ~[kafka-streams-2.8.0.jar:?]
	at org.apache.kafka.streams.processor.internals.metrics.StreamsMetricsImpl.maybeMeasureLatency(StreamsMetricsImpl.java:884) ~[kafka-streams-2.8.0.jar:?]
	at org.apache.kafka.streams.state.internals.MeteredKeyValueStore.put(MeteredKeyValueStore.java:200) ~[kafka-streams-2.8.0.jar:?]
	at org.apache.kafka.streams.processor.internals.AbstractReadWriteDecorator$KeyValueStoreReadWriteDecorator.put(AbstractReadWriteDecorator.java:120) ~[kafka-streams-2.8.0.jar:?]
	at org.apache.kafka.streams.kstream.internals.KTableSource$KTableSourceProcessor.process(KTableSource.java:122) ~[kafka-streams-2.8.0.jar:?]
```
Am I missing anything? Any help from the CloudEvents team is appreciated.
Here is my code sample -
I also tried using a WrapperSerde, as described in "Issue with configuring Serdes for Kafka Streams". However, I keep running into the same exception shown above.
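For context, the stack trace shows the KTable's state store calling the two-argument `serialize(String, byte[])` overload via `ValueAndTimestampSerializer`, while `CloudEventSerializer` only implements the three-argument, header-based overload. One possible workaround (a sketch, not an official CloudEvents recipe; the class names `CloudEventStoreSerde`, `StoreSerializer`, and `StoreDeserializer` are my own) is a wrapper Serde that forces STRUCTURED encoding, so the whole event lives in the value bytes rather than in record headers, and supplies the headers the delegate expects:

```java
import java.nio.charset.StandardCharsets;
import java.util.HashMap;
import java.util.Map;

import io.cloudevents.CloudEvent;
import io.cloudevents.jackson.JsonFormat;
import io.cloudevents.kafka.CloudEventDeserializer;
import io.cloudevents.kafka.CloudEventSerializer;
import io.cloudevents.kafka.Encoding;
import org.apache.kafka.common.header.internals.RecordHeaders;
import org.apache.kafka.common.serialization.Deserializer;
import org.apache.kafka.common.serialization.Serde;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.common.serialization.Serializer;

/**
 * Serde sketch for using CloudEvents in KTable state stores.
 * State stores call the two-argument serialize/deserialize overloads,
 * which CloudEventSerializer/CloudEventDeserializer reject, so this
 * wrapper forces STRUCTURED (JSON) encoding and provides the headers
 * itself. Requires cloudevents-json-jackson on the classpath.
 */
public class CloudEventStoreSerde {

    public static Serde<CloudEvent> serde() {
        return Serdes.serdeFrom(new StoreSerializer(), new StoreDeserializer());
    }

    static final class StoreSerializer implements Serializer<CloudEvent> {
        private final CloudEventSerializer delegate = new CloudEventSerializer();

        @Override
        public void configure(Map<String, ?> configs, boolean isKey) {
            Map<String, Object> cfg = new HashMap<>(configs);
            // Structured mode keeps all event attributes in the value bytes,
            // so nothing is lost when the store has no real record headers.
            cfg.put(CloudEventSerializer.ENCODING_CONFIG, Encoding.STRUCTURED);
            cfg.put(CloudEventSerializer.EVENT_FORMAT_CONFIG, JsonFormat.CONTENT_TYPE);
            delegate.configure(cfg, isKey);
        }

        @Override
        public byte[] serialize(String topic, CloudEvent data) {
            // The state store calls this overload; hand the delegate
            // throwaway headers instead of letting it throw.
            return delegate.serialize(topic, new RecordHeaders(), data);
        }
    }

    static final class StoreDeserializer implements Deserializer<CloudEvent> {
        private final CloudEventDeserializer delegate = new CloudEventDeserializer();

        @Override
        public CloudEvent deserialize(String topic, byte[] data) {
            // Re-create the content-type header the structured decoder keys on.
            RecordHeaders headers = new RecordHeaders();
            headers.add("content-type",
                    JsonFormat.CONTENT_TYPE.getBytes(StandardCharsets.UTF_8));
            return delegate.deserialize(topic, headers, data);
        }
    }
}
```

The table would then be materialized with this serde instead of the raw CloudEvent serde, e.g. `builder.table("cloudevent-ktable", Consumed.with(Serdes.String(), CloudEventStoreSerde.serde()))`. Note this assumes the store contents can be re-encoded as structured JSON; whether that is acceptable depends on your use case.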