WiP: Main loop instrumentation #244
base: canon
Conversation
Pulled in as of commit 1c897126a704fe848469ae8247a3f09e2e5a00a8
This is useful for collecting statistics or running an end-to-end test of PPB.
Also, make the frame's start time be its index in the dataframe
Feather is a binary data format that is lighter in both disk space and processing time.
Possible now that the engine doesn't use an unconventional index
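For reference, a minimal sketch (not the PR's code; the column names are made up) of writing such a dataframe to Feather with pandas and reading it back:

```python
import pandas as pd

# Hypothetical per-frame measurements collected by the instrumentation.
frames = [
    {"start_time": 0.000, "update_time": 0.0021, "gc_unreachable": 0},
    {"start_time": 0.008, "update_time": 0.0019, "gc_unreachable": 37},
]
df = pd.DataFrame(frames)

# to_feather() needs pyarrow and, at least in older pandas versions, the
# default integer index; hence the note about dropping the unconventional index.
df.to_feather("instrumentation.feather")

# Reading it back is cheap, and the dtypes survive the round trip.
df = pd.read_feather("instrumentation.feather")
```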
Instead of moving towards the humans, move the AI towards the point where its trajectory will intersect with the human's, assuming constant velocities. That assumption is violated in several ways:

1. the humans have slight, random angular motion, rather than going in straight lines;
2. the humans bounce off the sides of the game area;
3. once the AI gets close enough, the humans get scared, speeding up both themselves and the AI.

As the intercepts are recomputed at each frame, the random perturbations (1) are smoothly followed by the AI. Unlike those, (2) and (3) are large, punctual changes in velocity; they result in slightly suboptimal steering (i.e. a better AI could catch up slightly faster) and discontinuities (i.e. the AI suddenly changes direction when its target bounces or gets close enough to be scared).
Previously, the slowest intercept was used, to guarantee that the solution is valid (i.e. t > 0; the AI cannot travel backwards in time :P). However, in some configurations two positive solutions are possible; we now select the smallest such solution (i.e. the fastest intercept).
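For concreteness, with constant velocities the intercept time t satisfies |r + v t| = s t (r: target position relative to the AI, v: target velocity, s: AI speed), which is a quadratic in t. A standalone sketch of that computation, not the actual hugs code:

```python
import math

def intercept_point(chaser_pos, chaser_speed, target_pos, target_vel):
    """Return where a chaser moving at chaser_speed can meet a target that
    keeps a constant velocity, or None if no future intercept exists."""
    rx = target_pos[0] - chaser_pos[0]
    ry = target_pos[1] - chaser_pos[1]
    vx, vy = target_vel

    # |r + v*t| = s*t  =>  (v.v - s^2) t^2 + 2 (r.v) t + r.r = 0
    a = vx * vx + vy * vy - chaser_speed ** 2
    b = 2 * (rx * vx + ry * vy)
    c = rx * rx + ry * ry

    if abs(a) < 1e-9:
        # Same speeds: the quadratic degenerates to a linear equation.
        ts = [-c / b] if abs(b) > 1e-9 else []
    else:
        disc = b * b - 4 * a * c
        if disc < 0:
            return None
        root = math.sqrt(disc)
        ts = [(-b - root) / (2 * a), (-b + root) / (2 * a)]

    ts = [t for t in ts if t > 0]  # the AI cannot travel backwards in time
    if not ts:
        return None
    t = min(ts)  # fastest valid intercept, as described above
    return (target_pos[0] + vx * t, target_pos[1] + vy * t)
```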
This gives greater confidence that AISprite.intercept isn't raising ValueError; min() was expected to raise it when `targets` is empty (no runners are left), which happens at the end of the game.
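Purely as an illustration of that guard (choose_target and time_to_intercept are invented names):

```python
def choose_target(ai, targets):
    # With the empty case handled up front, a ValueError escaping from min()
    # could only come from AISprite.intercept itself.
    if not targets:
        return None  # no runners left: the game is over
    return min(targets, key=ai.time_to_intercept)
```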
Does the suggested instrumentation design make sense? I pulled in pandas, which is a pretty heavy dependency (it pulls in numpy and a bunch of other things), but I believe that to be reasonable because:
Oh no, the nanosecond counters only exist in Python 3.7 and later :(
Nanosecond timers are only available starting with Python 3.7
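A small compatibility shim along those lines (a sketch, assuming only a monotonic nanosecond counter is needed):

```python
import time

try:
    # Python 3.7+: integer nanoseconds, no float rounding.
    perf_counter_ns = time.perf_counter_ns
except AttributeError:
    # Older Pythons: fall back to the float counter, converted to nanoseconds.
    def perf_counter_ns():
        return int(time.perf_counter() * 1e9)
```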
Currently, we run the GC every 100th frame; a less basic implementation could use the collection statistics and average GC duration to decide when to run.
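A sketch of roughly that behaviour (record() is a hypothetical instrumentation hook):

```python
import gc

GC_PERIOD = 100  # run a manual collection every 100th frame

gc.disable()  # keep the automatic collector from firing mid-frame
frame_count = 0

def end_of_frame():
    global frame_count
    frame_count += 1
    if frame_count % GC_PERIOD == 0:
        # gc.collect() returns the number of unreachable objects found,
        # which is what the gc_unreachable column records.
        record("gc_unreachable", gc.collect())
```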
Avoid loading the pandas library and interacting with it until the game ends. This results in a 10× performance improvement (with profiling turned on), i.e. a reduction in instrumentation overhead of at least 90%. Moreover, pandas creates reference cycles which need to be cleaned up by the garbage collector, falsifying the GC-related measurements.
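The pattern looks roughly like this (names invented): keep plain Python objects during the game and only touch pandas at shutdown:

```python
samples = []  # plain dicts only; appending is cheap and creates no reference cycles

def record_frame(stats):
    samples.append(stats)

def dump(path="instrumentation.feather"):
    # pandas is imported lazily, after the game has ended, so neither the
    # import cost nor pandas' internal reference cycles affect the measurements.
    import pandas as pd
    pd.DataFrame(samples).to_feather(path)
```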
Pandas infers the correct datatypes in the first place. Moreover, coercing gc_unreachable from integers to floats is wrong.
Switched to pandas' built-in matplotlib integration.
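For example (column name invented), a plot is then a one-liner on the dataframe:

```python
import pandas as pd

df = pd.read_feather("instrumentation.feather")
# Series.plot() goes straight through pandas' matplotlib integration.
ax = df["update_time"].plot(title="Per-frame update time (s)")
ax.figure.savefig("update_time.png")
```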
OK, now that I have visualisation & stats output, I made the most basic control loop for the engine, and there is already a 12-fold improvement in jitter (0.9ms RMS -> 0.075ms RMS) and a large CPU usage reduction (we now spend the vast majority of our time sleeping). I would suggest the following changes to the PR before merging:
I'd like feedback from @ppb/maintainers before doing all that, though. Next steps after this PR could be adding GC scheduling (right now it's just being run every 100th iteration), so we can avoid running the GC when short on time, or when we predict no garbage is being produced.

Without control loop (runs as fast as possible)
With control loop (aimed at ~120fps)
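For context, a minimal sketch of the kind of fixed-rate, sleep-based control loop described above; the engine API here (engine.running, engine.advance()) is hypothetical:

```python
import time

TARGET_FPS = 120
FRAME_TIME = 1.0 / TARGET_FPS

def run(engine):
    next_frame = time.perf_counter()
    while engine.running:
        engine.advance()  # one update/render iteration (hypothetical API)
        next_frame += FRAME_TIME
        remaining = next_frame - time.perf_counter()
        if remaining > 0:
            # Sleep off the rest of the frame budget instead of spinning.
            time.sleep(remaining)
        else:
            # Overran the budget: resynchronise rather than trying to catch up.
            next_frame = time.perf_counter()
```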
mutant-games/hugs) For simplicity, I did it by defining an AISprite variant of PlayerSprite that implements custom behaviour, rather than simulating input events. This is entirely unnecessary, but I somewhat nerd-sniped myself.
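As an illustration only (not the hugs code), such a sprite might override PPB's on_update handler and steer from game state instead of input; Runner, speed, and the intercept() placeholder are assumed names here:

```python
import ppb


class Runner(ppb.Sprite):
    """Stand-in for the game's human sprite (assumed name)."""


class AISprite(ppb.Sprite):
    speed = 4  # assumed units per second

    def intercept(self, runner):
        # Placeholder: the real version solves the constant-velocity
        # intercept described in the commits above.
        return runner.position

    def on_update(self, update_event, signal):
        runners = list(update_event.scene.get(kind=Runner))
        if not runners:
            return  # nothing left to chase
        target = self.intercept(runners[0])
        direction = (target - self.position).normalize()
        self.position += direction * self.speed * update_event.time_delta
```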