coq-par-compile: use hash for ancestors #503
Conversation
The hash avoids an exponentially growing number of duplicates in the ancestor collection. Fixes ProofGeneral#499
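As a rough illustration only (not the actual patch; the names below are hypothetical), recording ancestors in an Emacs Lisp hash table absorbs the duplicates that a plain list accumulates when many dependency paths reach the same file:

```elisp
;; Sketch: deduplicate ancestor files via a hash table instead of a list.
(defvar my-ancestor-table (make-hash-table :test 'equal)
  "Hypothetical table mapping ancestor file names to t.")

(defun my-add-ancestor (file)
  "Record FILE as an ancestor; repeated additions are absorbed by the hash."
  (puthash file t my-ancestor-table))

(defun my-ancestor-list ()
  "Return all recorded ancestors as a list, without duplicates."
  (let (result)
    (maphash (lambda (file _val) (push file result)) my-ancestor-table)
    result))
```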
Where can I find something about the test that is failing here?
Hi @hendriktews, you can just click on the Checks tab of the PR.
@hendriktews although I'm not very elisp-hash-savvy, the PR LGTM and compiles fine… do you need @RalfJung to perform another test before the merge, or only once your patch is available on MELPA?
Yes, but there I only see that the test (8.10, minimal) fails. Where can I find a description of the failing test or its source code? Even finding out that it is 020_coq-test-definition is difficult, because that is hidden somewhere in the middle of the log. I really appreciate that we now have continuous integration and even automated tests! I am willing to investigate when something fails on one of my PRs, but I don't really know how.
Yes, the interface of GitHub Actions logs is not extremely intuitive, but to summarize:
I tested it on @RalfJung's repo and also tested it with a compilation error and coq-compile-keep-going ...
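For reference, coq-compile-keep-going is a Proof General user option; a minimal sketch of enabling it in an Emacs init file (assuming the standard defcustom name):

```elisp
;; Sketch: keep background compilation of dependencies going after an
;; error in one .v file, instead of stopping at the first failure.
(setq coq-compile-keep-going t)
```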
OK, thanks @hendriktews! 👍
So if no one objects, let's merge your PR tomorrow (Tuesday).