KafkaProducer doesn’t flush #403

Open
Glutexo opened this issue Aug 16, 2019 · 0 comments

Comments

@Glutexo (Collaborator)

Glutexo commented Aug 16, 2019

The Kafka producer in the MQ service writes the event messages using the send method. Sending is asynchronous, though, and unless the blocking flush method is called, the messages may be lost if the process exits. See the docs for an example.
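The hazard can be illustrated without a broker. The following is a minimal pure-Python stand-in (not the actual kafka-python code) for an asynchronous producer: send only enqueues a message, a background thread delivers it later, and flush blocks until the queue drains.

```python
import queue
import threading

class AsyncProducer:
    """Toy stand-in for an async producer; names are illustrative."""

    def __init__(self):
        self._queue = queue.Queue()
        self.delivered = []
        threading.Thread(target=self._worker, daemon=True).start()

    def _worker(self):
        while True:
            message = self._queue.get()
            self.delivered.append(message)  # simulate network delivery
            self._queue.task_done()

    def send(self, message):
        # Returns immediately; delivery happens later on the worker thread.
        self._queue.put(message)

    def flush(self):
        # Blocks until every queued message has been delivered.
        self._queue.join()

producer = AsyncProducer()
producer.send("event-1")
producer.flush()  # without this, the daemon worker can die with the process
assert "event-1" in producer.delivered
```

Because the worker is a daemon thread, a process that exits right after send, without calling flush, may terminate the worker before the message is ever delivered.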

When everything goes right, this doesn’t cause problems, because the event loop keeps running, waiting for new messages. Because of that, there is always time for a sent message to actually be produced.

If something goes wrong, though (an error occurs, an exception is raised, or the consumer times out), the event loop ends and so does the whole process. It doesn’t wait for the producer thread; it kills it. This can be prevented by calling the KafkaProducer.flush method after the loop ends, and possibly also when an exception occurs.

This needs to be implemented in the KafkaEventProducer wrapper.
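One possible shape for that fix is a try/finally guard around the loop, so flush runs on normal exit, on exception, and on consumer timeout alike. This is a hypothetical sketch; the names run_loop and run_with_flush are illustrative, not taken from the codebase.

```python
def run_with_flush(run_loop, producer):
    """Run the event loop and flush the producer no matter how it ends."""
    try:
        run_loop()
    finally:
        # Executes on normal exit and when run_loop raises, so buffered
        # messages are delivered before the process exits.
        producer.flush()
```

The wrapper re-raises nothing itself: an exception from run_loop still propagates after the flush completes.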

Thinking out loud:

  • Maybe this is a case where catching ’em all is actually a good idea?
  • Maybe run the consumer in a separate thread from the producer, so an exception doesn’t take down the whole process?
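The two ideas above could be combined: run the consumer loop in its own thread and catch any exception there, so a crash in the consumer doesn’t take the producer down with it. A hypothetical sketch (start_consumer_thread and the callback names are illustrative):

```python
import threading

def start_consumer_thread(consume, on_error):
    """Run consume() on a separate thread, reporting any exception
    to on_error instead of letting it kill the process."""

    def guarded():
        try:
            consume()
        except Exception as exc:  # the "catching 'em all" idea
            on_error(exc)

    thread = threading.Thread(target=guarded)
    thread.start()
    return thread
```

The caller can then decide in on_error whether to restart the consumer or shut down cleanly, flushing the producer first.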