DBT sink + Flink with Kafka consumer reading from Redpanda and sending messages to an http endpoint #7
ivanaslamov started this conversation in Show and tell
Replies: 1 comment
-
Thanks for sharing your project, @ivanaslamov! Using … Let us know if you give it another go! @bobbyiliev and @joacoc are more proficient with …
-
I wanted to create something that could potentially be used as a starting point for implementing a reverse ETL pipeline using Materialize and Redpanda.
In https://github.com/ivanaslamov/mz-hack-day-2022 I introduced a new sink that sends flight data to another Redpanda topic (in my project I reused the flight data materialized view as is, but it can be any other view). Once back in Redpanda, the events are consumed by a Flink application that forwards them to an HTTP endpoint.
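
For context, here is a minimal sketch of what such a Flink application could look like, assuming Flink's Kafka connector and a hand-rolled HTTP sink based on `java.net.http` (Java 11+). The broker address, topic name, endpoint URL, and class names below are placeholders, not the actual project code:

```scala
import java.util.Properties
import java.net.URI
import java.net.http.{HttpClient, HttpRequest, HttpResponse}

import org.apache.flink.api.common.serialization.SimpleStringSchema
import org.apache.flink.configuration.Configuration
import org.apache.flink.streaming.api.functions.sink.{RichSinkFunction, SinkFunction}
import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer

// Hypothetical sink that POSTs each record to an HTTP endpoint.
class HttpSink(endpoint: String) extends RichSinkFunction[String] {
  @transient private var client: HttpClient = _

  override def open(parameters: Configuration): Unit =
    client = HttpClient.newHttpClient()

  override def invoke(value: String, context: SinkFunction.Context): Unit = {
    val request = HttpRequest.newBuilder()
      .uri(URI.create(endpoint))
      .header("Content-Type", "application/json")
      .POST(HttpRequest.BodyPublishers.ofString(value))
      .build()
    client.send(request, HttpResponse.BodyHandlers.ofString())
  }
}

object Application {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment

    val props = new Properties()
    props.setProperty("bootstrap.servers", "redpanda:9092") // placeholder broker address
    props.setProperty("group.id", "flink-http-forwarder")

    // The topic name must match the topic the Materialize sink writes to.
    val consumer = new FlinkKafkaConsumer[String](
      "flight_information_sink", // placeholder topic name
      new SimpleStringSchema(),
      props
    )

    env
      .addSource(consumer)
      .addSink(new HttpSink("http://localhost:8080/flights")) // placeholder endpoint

    env.execute("redpanda-to-http")
  }
}
```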
To run the project:

1. Start the containers with `docker-compose up`.
2. Run `dbt run` as described in the instructions.
3. List the Redpanda topics (e.g., with `rpk topic list`) to find the name of the materialized view's sink topic, then update it in the `Application.scala` code so the Flink job reads from the right place.
4. Update the HTTP endpoint in the Flink HTTP sink code and restart the Flink container.

Note for the future: I need to get more familiar with Materialize's `TAIL` mechanism to potentially improve this implementation further.
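
As a rough starting point, `TAIL` can be consumed over the Postgres wire protocol using a cursor, which could remove the extra round trip through Redpanda. The sketch below assumes the standard PostgreSQL JDBC driver; the connection details and view name are placeholders, and this is only an illustration, not the project's implementation:

```scala
import java.sql.DriverManager

object TailSketch {
  def main(args: Array[String]): Unit = {
    // Materialize speaks the Postgres wire protocol; host, port, and user here are assumptions.
    val conn = DriverManager.getConnection(
      "jdbc:postgresql://materialized:6875/materialize", "materialize", "")
    conn.setAutoCommit(false) // TAIL cursors must live inside a transaction

    val stmt = conn.createStatement()
    stmt.execute("DECLARE c CURSOR FOR TAIL flight_information") // placeholder view name

    // Each FETCH returns the next batch of changes; rows carry mz_timestamp,
    // mz_diff, and then the view's own columns.
    while (true) {
      val rs = stmt.executeQuery("FETCH ALL c")
      while (rs.next()) {
        val diff = rs.getLong("mz_diff")
        println(s"diff=$diff row=${rs.getString(3)}")
      }
      rs.close()
    }
  }
}
```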