Is your feature request related to a problem? Please describe.
Based on the experience we gained with the .NET stream client, I'd like to report some best practices we should adopt in version 2 of this client.
Describe the solution you'd like
Tests to add:
- Create and destroy producers and consumers in a short time
- Create and destroy producers and consumers in a short time from multiple threads (a minimal stress-test sketch follows this list)
- Create producers and consumers quickly and close them from another thread
- Test the auto-reconnect when the broker is restarted
- Test the auto-reconnect and try to delete the stream during the reconnection ([edge case] Error during reconnect if the stream in the meantime is deleted rabbitmq-stream-dotnet-client#335)
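As a rough illustration of the first three scenarios, here is a minimal stress-test sketch in Go. The `Env` interface, the `NewProducer`/`NewConsumer` names, and the iteration counts are assumptions for illustration only, not the actual v2 API:

```go
package stream_test

import (
	"sync"
	"testing"
)

// Env abstracts the client entry point; the real v2 API may differ.
type Env interface {
	NewProducer(stream string) (interface{ Close() error }, error)
	NewConsumer(stream string) (interface{ Close() error }, error)
}

// stressCreateClose creates and immediately closes producers and consumers
// from several goroutines to exercise fast create/destroy cycles.
func stressCreateClose(t *testing.T, env Env, stream string) {
	const goroutines = 16
	const iterations = 50

	var wg sync.WaitGroup
	for g := 0; g < goroutines; g++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for i := 0; i < iterations; i++ {
				p, err := env.NewProducer(stream)
				if err != nil {
					t.Error(err)
					return
				}
				c, err := env.NewConsumer(stream)
				if err != nil {
					t.Error(err)
					return
				}
				// Close right away to exercise fast create/destroy cycles.
				_ = p.Close()
				_ = c.Close()
			}
		}()
	}
	wg.Wait()
}
```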
Edge cases to handle:
- Single Active Consumer: When a consumer is no longer the active one, it should stop parsing chunks even if there are messages pending
- Consumer: As soon as the consumer is asked to close, it should stop delivering messages (see the dispatch-loop sketch after this list)
- Producer and Consumer: The TCP client could still receive data after the producer and/or consumer are closed. The client has to ignore that data and possibly log it, as the .NET client does
- Metadata update: The client receives a metadata update when a stream is deleted or a replica is removed, even if the client is connected to another node. The server drops all the consumers and producers, so the client needs to reconnect if the stream still exists, or close the entities (and release the connection, if possible) if it does not
- Auto-Reconnect: `StreamDoesNotExist` is different from `StreamNotAvailable`; the second one can be a temporary situation, so the client has to retry (see the retry sketch after this list)
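To make the close and single-active-consumer expectations concrete, a possible shape of the dispatch loop is sketched below in Go. The `consumer` fields, the chunk channel, and the handler signature are assumptions for illustration; the point is only that delivery checks the closed/active flags before parsing or handing out messages:

```go
package stream

import "sync/atomic"

// chunk groups the messages delivered by one broker deliver frame.
type chunk struct {
	messages [][]byte
}

// consumer holds only the state needed for the dispatch decision.
type consumer struct {
	closed  atomic.Bool  // set by Close()
	active  atomic.Bool  // cleared when a consumer update says we are no longer active
	chunks  chan chunk
	handler func(msg []byte)
}

// dispatch delivers messages to the handler, dropping anything that arrives
// after a close request or while the consumer is not the active one.
func (c *consumer) dispatch() {
	for ch := range c.chunks {
		if c.closed.Load() || !c.active.Load() {
			// Skip the whole chunk: do not parse or deliver pending messages.
			continue
		}
		for _, m := range ch.messages {
			if c.closed.Load() {
				return // stop mid-chunk as soon as a close is requested
			}
			c.handler(m)
		}
	}
}
```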
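For the auto-reconnect distinction, a minimal retry sketch (in Go, with hypothetical sentinel errors and a hypothetical `connect` callback) could look like this:

```go
package stream

import (
	"errors"
	"time"
)

// Sentinel errors mirroring the protocol response codes (names illustrative).
var (
	ErrStreamDoesNotExist = errors.New("stream does not exist")
	ErrStreamNotAvailable = errors.New("stream not available")
)

// reconnectWithRetry retries the supplied connect function while the failure
// looks temporary (StreamNotAvailable) and gives up immediately when the
// stream is gone (StreamDoesNotExist), so the caller can close the entity
// and release the connection.
func reconnectWithRetry(connect func() error, maxAttempts int) error {
	backoff := 500 * time.Millisecond
	for attempt := 0; attempt < maxAttempts; attempt++ {
		err := connect()
		switch {
		case err == nil:
			return nil
		case errors.Is(err, ErrStreamDoesNotExist):
			// Permanent: the stream was deleted; do not retry.
			return err
		case errors.Is(err, ErrStreamNotAvailable):
			// Likely transient (leader election, replica restart): back off and retry.
			time.Sleep(backoff)
			backoff *= 2
		default:
			return err
		}
	}
	return ErrStreamNotAvailable
}
```

The exponential backoff keeps reconnect attempts from hammering a broker that is still recovering; capping the attempts leaves the final decision (close vs. keep retrying) to the caller.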
Describe alternatives you've considered
No response
Additional context
No response