Hey everyone! I have been interacting with the Gomu Gomu project for a few days now as I was implementing the cache layer over Madara. This gave me some time to think about some ideas for benchmark flows, and after discussing with @d-roak we decided to open an issue to talk about it.
## Separation of concerns
I do not think this brings anything new under the sun compared to the initial issue by @abdelhamidbakhta, but I wanted to share my whole thinking process.
From my understanding there are a few elements driving the development of the project, but the main one seems to be having a standard benchmark toolkit for every Starknet sequencer implementation. This pushes for a CLI that could be used against any sequencer RPC endpoint.
However, I think each sequencer will also need to implement its own benchmark flows, so that should be straightforward to achieve, either in the sequencer repo or in this one.
By splitting the Gomu Gomu implementation into three crates (or two if three seems overkill), I think we could achieve that. My idea was:
- `cli` crate: contains the interface to interact with the shooter logic through a terminal. Practical for easy integration in CI and use in a dev environment.
- `lib` crate: contains the core logic of the shooter. Is responsible for running the benchmark flows, measuring performance, and generating reports.
- `flow` crate: contains the logic for each benchmark flow, along with its own ephemeral state during testing.
Here `lib` and `flow` could be merged, but I liked the idea of having each crate responsible for different elements.
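For illustration, a minimal sketch of what the workspace manifest could look like under this split (member directories simply named after the crates above):

```toml
# Hypothetical Cargo workspace layout for the three-crate split.
[workspace]
members = [
    "cli",  # terminal interface, CI entry point
    "lib",  # shooter core: runs flows, measures, reports
    "flow", # standard benchmark flows and their ephemeral state
]
```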
## Implementation of the flow execution
I think the current `GatlingShooter` is really good at what it does and we should not touch it too much. However, I would change the way we run benchmarks.
For starters, I would try to abstract the benchmark logic as much as possible from `GatlingShooter` and make it generic over a trait we define. Taking inspiration from `tester::Bencher`, we could add this method to run flows:
```rust
impl GatlingShooter {
    // Arguments available in the closure should be iterated upon to
    // identify everything needed during a benchmark.
    fn iter<T, F: Flow<T>>(&mut self, flow: F, count: u64) {
        // Run the flow the necessary number of times while measuring
        // the time taken each time.
        // Save all measured performances in GatlingShooter.
    }
}
```
Then we could declare flows over a trait:
```rust
trait Flow<T> {
    // Called by GatlingShooter::iter() before running each closure.
    fn initialize(gatling_shooter: GatlingShooter) -> Result<Self, Error>
    where
        Self: Sized;

    // Method to be run by GatlingShooter::iter().
    fn execution(
        rpc: Arc<JsonRpcClient<HttpTransport>>,
        accounts: Vec<SingleOwnerAccount<Arc<JsonRpcClient<HttpTransport>>, LocalWallet>>,
    ) -> T;

    // Maybe add a finalize() here?
}
```
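To make the shape concrete, here is a minimal sketch of what implementing the trait could look like for a project-specific flow. `TransferFlow` is a hypothetical name, the `Error` type is assumed to come from the surrounding lib crate, and the choice of `FieldElement` (a transaction hash) as `T` is just one option:

```rust
use std::sync::Arc;

use starknet::{
    accounts::SingleOwnerAccount,
    core::types::FieldElement,
    providers::jsonrpc::{HttpTransport, JsonRpcClient},
    signers::LocalWallet,
};

// Hypothetical flow issuing one ERC-20 transfer per iteration and
// returning the transaction hash so GatlingShooter can record it.
struct TransferFlow {
    // Ephemeral state owned by the flow for the duration of the run.
    erc20_address: FieldElement,
}

impl Flow<FieldElement> for TransferFlow {
    fn initialize(gatling_shooter: GatlingShooter) -> Result<Self, Error> {
        // Deploy (or locate) the ERC-20 contract and fund the accounts here.
        todo!()
    }

    fn execution(
        rpc: Arc<JsonRpcClient<HttpTransport>>,
        accounts: Vec<SingleOwnerAccount<Arc<JsonRpcClient<HttpTransport>>, LocalWallet>>,
    ) -> FieldElement {
        // Build, sign, and send a single transfer transaction.
        todo!()
    }
}
```

With something like this in place, running a custom flow would be a matter of initializing it and handing it to `GatlingShooter::iter()` with the desired iteration count.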
If we consider having a `flow` crate, all standard flows defined in Gomu Gomu could be exported over an enumeration to ease logic in the `cli` crate. We could define a proc macro to auto-generate it at build time.
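As a rough sketch of what that auto-generated enumeration could look like (the `Transfer` and `Mint` variants are hypothetical placeholders for whatever standard flows end up existing):

```rust
// Hypothetical output of the proc macro: one variant per standard flow,
// plus a parser so the cli crate can map a CLI argument to a flow.
pub enum StandardFlow {
    Transfer,
    Mint,
}

impl StandardFlow {
    pub fn from_name(name: &str) -> Option<Self> {
        match name {
            "transfer" => Some(Self::Transfer),
            "mint" => Some(Self::Mint),
            _ => None,
        }
    }
}
```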
With this kind of setup it would also be quite easy for any project using the `lib` crate directly to define its own flows in Gomu Gomu.
## Miscellaneous thoughts
- It might be interesting to be able to re-initialize the state of the node we are running against, so as to test consistent behavior. Is it really worth it? Could we achieve it?
- Is there a need to pass specific command line arguments for a given benchmark flow?
Looking forward to hearing your thoughts on this or answering any questions!