Data races #18
These two data races seem rare and hard to trigger.
Could you share some of your code?
Unfortunately I can't right now, but I will try to dig up more information and possibly even a reproduction test case.
There are many things wrong. I'm working on a proper fix for the entire lib, but it will take a bit more time.
Don't know if this is relevant, but I've experienced some of this:
I was able to fix it by just wrapping a Lock() and Unlock() around that c.lastEventID = message.id in channel.go:26 like this:
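A minimal sketch of that change, assuming a sync.Mutex field added to the Channel struct; everything here apart from the c.lastEventID = message.id line is a stand-in, not the library's actual channel.go:

```go
// Sketch only: stand-in types to show where the Lock/Unlock pair goes.
package sse

import "sync"

type Message struct {
	id string
	// ... data, event, retry fields elided ...
}

type Channel struct {
	mu          sync.Mutex // added: protects lastEventID
	lastEventID string
	// ... client registry elided ...
}

// SendMessage records the last event ID under the lock before delivering
// the message, so concurrent sends no longer race on the field.
func (c *Channel) SendMessage(message *Message) {
	c.mu.Lock()
	c.lastEventID = message.id
	c.mu.Unlock()
	// ... deliver message to subscribed clients ...
}
```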
At least in initial testing, it resolves the data race.
In my case, it was very reproducible and this instantly fixed it.
Spoke too soon -- have another on sse.go:96:
Yes, it can happen anywhere. The lib is using goroutines where they're not needed, and because of that it needs locks everywhere. The way to fix it is to remove most of them and leave them only where required.
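As a generic illustration of that direction (made-up names, not go-sse's actual code): spawning a goroutine per send is what forces a mutex around the shared field, while a plain synchronous send needs no lock at all when its callers are already serialized, for example by a single dispatch loop per client.

```go
// Contrast sketch: goroutine-per-send needs a mutex, synchronous send does not.
package main

import (
	"fmt"
	"sync"
)

// goroutinePerSend spawns a goroutine for every send, so lastEventID is
// shared between goroutines and every access must hold the mutex.
type goroutinePerSend struct {
	mu          sync.Mutex
	lastEventID string
	wg          sync.WaitGroup
}

func (c *goroutinePerSend) send(id string) {
	c.wg.Add(1)
	go func() {
		defer c.wg.Done()
		c.mu.Lock()
		c.lastEventID = id
		c.mu.Unlock()
		// ... write the event to the client ...
	}()
}

// synchronousSend does the work inline; a caller that sends from a single
// loop never shares lastEventID, so no lock is needed.
type synchronousSend struct {
	lastEventID string
}

func (c *synchronousSend) send(id string) {
	c.lastEventID = id
	// ... write the event to the client ...
}

func main() {
	a := &goroutinePerSend{}
	a.send("1")
	a.send("2")
	a.wg.Wait() // Wait establishes happens-before, so the read below is safe

	b := &synchronousSend{}
	b.send("1")
	b.send("2")
	fmt.Println("last ids:", a.lastEventID, b.lastEventID)
}
```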
Can reproduce the issue with setting lastEventID. My usage pattern sends multiple messages of different types to the same client, like multiple topic subscriptions. It shows up easily in tests, due to the absence of any significant delay. The race log is as follows:

WARNING: DATA RACE
Previous write at 0x00c000436258 by goroutine 48:

and the patch is exactly the same:
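The patch referred to is the same mutex around the lastEventID write shown earlier. A stand-in sketch of the kind of test that surfaces it (illustrative names only, not go-sse's code); running it with go test -race prints a warning like the one above:

```go
// repro_test.go (sketch): several "topic" publishers write the same
// client's lastEventID with no delay between sends.
package repro

import (
	"strconv"
	"sync"
	"testing"
)

// clientState mimics the shared per-client field the report points at
// (channel.go:26); the type itself is illustrative only.
type clientState struct {
	lastEventID string
}

func (c *clientState) send(id string) {
	c.lastEventID = id // unsynchronized write, as in the report
}

func TestMultiTopicPublishRace(t *testing.T) {
	c := &clientState{}
	var wg sync.WaitGroup
	for topic := 0; topic < 4; topic++ { // "multiple topic subscriptions"
		wg.Add(1)
		go func(topic int) {
			defer wg.Done()
			for i := 0; i < 100; i++ {
				c.send("topic" + strconv.Itoa(topic) + "-" + strconv.Itoa(i))
			}
		}(topic)
	}
	wg.Wait()
}
```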
Protect the option by wrapping it in a mutex. The data race is gone. It closes alexandrevicenzi#18
https://gist.github.com/iscander/9a144528e563dc5969346d5e0b61cd4f is yet another sample which raises the same data race, but as part of the HTTP handler.
With [email protected], when testing publishing from multiple browsers with the race detector running, I encountered race conditions. Sorry, I don't have the dump handy, but I had another data race on the msg.retry write at go-sse/sse.go, line 94 in fcf9dce, which is being fed by:

SSE.SendMessage(chName, sse.NewMessage(strconv.Itoa(time.Now().Second()), string(msg), "msg"))

I'm using some complex code as well, so it's possible that I'm not 'doing it right', but these two data races under load seem to point at these two locations (channel.go:26 and sse.go:94).
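One pattern that avoids this class of race, sketched here with stand-in types rather than the library's actual sse.go: give each dispatch its own copy of the message before filling in per-send fields such as retry, so the write can never overlap with a read from another goroutine.

```go
// Copy-per-dispatch sketch (illustrative names, not go-sse's code).
package main

import (
	"fmt"
	"sync"
)

type message struct {
	id, data, event string
	retry           int // per-send field; the report points at its write
}

type server struct {
	retryInterval int
	dispatch      chan *message // read by a writer goroutine per client
}

// SendMessage fills in retry on a private copy instead of mutating the
// message it was handed, so concurrent sends never touch the same field.
func (s *server) SendMessage(msg *message) {
	m := *msg                 // copy; owned by this call only
	m.retry = s.retryInterval // safe: nothing else can see m yet
	s.dispatch <- &m
}

func main() {
	s := &server{retryInterval: 2000, dispatch: make(chan *message)}

	var wg sync.WaitGroup
	wg.Add(1)
	go func() { // stand-in for a client writer goroutine
		defer wg.Done()
		for m := range s.dispatch {
			fmt.Printf("id=%s retry=%d data=%q\n", m.id, m.retry, m.data)
		}
	}()

	for i := 0; i < 3; i++ {
		s.SendMessage(&message{id: fmt.Sprint(i), data: "hello", event: "msg"})
	}
	close(s.dispatch)
	wg.Wait()
}
```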