
PR to track down CI failures #501

Merged — 8 commits merged into develop from ci-experiments, Jan 4, 2024
Conversation

kayabaNerve (Member):
For some reason, some processors (12.5-25%) aren't deciding to start the key gen. The only potential candidate I have right now is the hyper update performed.

kayabaNerve added the "bug" label (Something isn't working) — Jan 1, 2024
kayabaNerve:
Passes locally.

Commit (zalloc): "An additional layer which protects us against edge cases with Zeroizing (objects which don't support it, or which miss it)."
kayabaNerve:
I believe the most recent commit was a mis-diagnosis on my end (we claim we're re-attempting completed items; we just don't preprocess for them).

kayabaNerve:
The most recent two commits passed CI, despite the past three commits not being supposed to enact any change. Likely this just shows how spotty the setup failures are?

kayabaNerve:
It could be a fluke which the re-attempt extension happened to resolve?

kayabaNerve:
Local run failed. There are no logs of any errors in the message-queue client. The message-queue client seems to be connecting without issue, just never yielding a message.

The message-queue logs do note the messages being queued though. I really have no idea what's up.

kayabaNerve (Jan 2, 2024):
The message-queue should be rewritten to automatically send messages upon their occurrence. I could work on such a rewrite and see if that happens to fix this?
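The push-based design suggested above can be sketched roughly as follows. This is a minimal, hypothetical illustration (the `PushQueue` type and its methods are invented for this sketch, not the actual message-queue API): instead of clients polling for messages, the queue forwards each message to registered subscribers the moment it is enqueued.

```rust
use std::sync::mpsc::{channel, Receiver, Sender};

// Hypothetical sketch of a push-based queue: subscribers register once,
// and every queued message is forwarded to them immediately.
struct PushQueue {
    subscribers: Vec<Sender<String>>,
}

impl PushQueue {
    fn new() -> Self {
        PushQueue { subscribers: Vec::new() }
    }

    // A subscriber registers and then simply blocks on its Receiver,
    // rather than repeatedly asking the queue for the next message.
    fn subscribe(&mut self) -> Receiver<String> {
        let (tx, rx) = channel();
        self.subscribers.push(tx);
        rx
    }

    // Enqueueing forwards the message to every subscriber, so delivery
    // happens upon occurrence rather than on the next poll.
    fn queue(&self, msg: &str) {
        for sub in &self.subscribers {
            // Ignore subscribers which have disconnected.
            let _ = sub.send(msg.to_string());
        }
    }
}

fn main() {
    let mut queue = PushQueue::new();
    let rx = queue.subscribe();
    queue.queue("key-gen message");
    // The subscriber receives the message without ever polling.
    println!("{}", rx.recv().unwrap());
}
```

A push model like this would also make the failure mode above more visible: a subscriber which never receives anything is distinguishable from one which simply hasn't polled yet.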

kayabaNerve:
The recent cargo update also updated tokio. I believe we're failing to open the tokio socket, yet since the attempt has no timeout, it never realizes this and never errors.
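The hang-without-error behavior described here is what a connect attempt with no deadline looks like. As a sketch of the fix, the standard library's `TcpStream::connect_timeout` bounds the attempt so a dead socket surfaces as an `Err` instead of blocking silently (the address and port below are placeholders, not the actual message-queue endpoint):

```rust
use std::net::{SocketAddr, TcpStream};
use std::time::Duration;

fn main() {
    // Placeholder address standing in for the message-queue server.
    // 10.255.255.1 is typically unroutable, so an unbounded connect
    // could hang indefinitely without ever logging an error.
    let addr: SocketAddr = "10.255.255.1:9999".parse().unwrap();

    // Bounding the attempt turns a silent hang into an explicit error.
    match TcpStream::connect_timeout(&addr, Duration::from_millis(200)) {
        Ok(_) => println!("connected"),
        Err(e) => println!("couldn't connect to message-queue server: {e}"),
    }
}
```

With tokio, the analogous approach is wrapping the connect future in `tokio::time::timeout`, which likewise converts an indefinite wait into an error the caller can act on.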

kayabaNerve:
There have definitely been mis-diagnoses here.

My local logs have "couldn't connect to message-queue server" due to a lookup failure, yet it doesn't seem to retry. On error, it should continue, sleep 5 seconds, and retry...
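The intended behavior described here (on error: log, sleep 5 seconds, retry) can be sketched as a simple loop. The `try_connect` function below is a stand-in invented for this illustration: it fails a few times, as a DNS lookup failure would, before succeeding.

```rust
use std::thread::sleep;
use std::time::Duration;

// Hypothetical stand-in for the message-queue connection attempt:
// fails on the first three attempts (e.g. a lookup failure), then succeeds.
fn try_connect(attempt: u32) -> Result<(), &'static str> {
    if attempt < 3 { Err("lookup failure") } else { Ok(()) }
}

// On error: log, sleep, and retry — never silently give up.
// Returns the number of failed attempts before success.
fn connect_with_retry() -> u32 {
    let mut attempt = 0;
    loop {
        match try_connect(attempt) {
            Ok(()) => return attempt,
            Err(e) => {
                println!("couldn't connect to message-queue server: {e}");
                attempt += 1;
                // The intended behavior sleeps 5 seconds; shortened here
                // so the sketch runs quickly.
                sleep(Duration::from_millis(10));
            }
        }
    }
}

fn main() {
    let attempts = connect_with_retry();
    println!("connected after {attempts} failed attempts");
}
```

The bug described above is the absence of exactly this loop: the error path logged once and then stopped, rather than continuing into another attempt.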

kayabaNerve:
c0f42ef resolves it.

To be clear, a couple of commits of work I was doing were placed here for my personal convenience (the zalloc commit, for instance).

kayabaNerve merged commit 7eb388e into develop on Jan 4, 2024 (18 of 20 checks passed) and deleted the ci-experiments branch — January 4, 2024 06:08