I'm having a hard time understanding jitter as it relates to my other params.
According to `Base`, `DEFAULT_JITTER = False`, and it doesn't seem like `DEFAULT_JITTER` is ever updated when I call a new `Limiter()`. I know I can set it as an arg with `Limiter(jitter=X)`.
I also looked at the definition of `_get_sleep_duration` in `Base`.
It's clear that `jitter` can be a bool, int, or float, but I don't really understand how I should size it with respect to my `consume` or `rate` values. I don't want to add so much jitter that it overpowers my sleep duration and I end up with a negative number.
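For intuition, here's a rough sketch (my own illustration, not this library's code) of one way to size jitter so it can never push the result negative: draw the jitter as a fraction of the sleep duration rather than as an absolute amount, and clamp at zero.

```python
import random

def jittered_sleep(duration: float, jitter_fraction: float = 0.1) -> float:
    """Subtract up to jitter_fraction of the duration as random jitter.

    Sizing jitter relative to the duration (rather than as an absolute
    number) guarantees the result stays non-negative. This is a sketch,
    not the library's implementation.
    """
    amount = random.random() * jitter_fraction * duration
    return max(0.0, duration - amount)
```

With `rate = 20` and `consume = 1`, a one-token sleep is 0.05 s, so a 10% jitter fraction stays under 5 ms and the result stays positive.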
Let's say I have these params:
```python
rate = 20       # replenish 20 tokens per second
consume = 1     # I want my function to run at most 20 times per second
capacity = 300  # I want my function to run for up to 15 seconds before it gets
                # limited (it can't be called more than 300 times per minute)
```
But according to the `_get_sleep_duration` calculation, my `duration = (1 - 300) / 20 = -14.95`, so I'm already at a negative number.
Am I thinking about my params correctly?
What's the right way to add jitter as a param?
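For what it's worth, here's a hedged token-bucket sketch (my own illustration, not the library's `_get_sleep_duration`) showing why the negative number isn't a problem: a bucket that starts full at `capacity = 300` can serve requests immediately, and `(consume - tokens) / rate` only turns positive, i.e. requires a sleep, once the bucket is drained.

```python
import time

class TokenBucket:
    """Illustrative token bucket; an assumption, not this library's implementation."""

    def __init__(self, rate: float, capacity: float):
        self.rate = rate          # tokens replenished per second
        self.capacity = capacity  # maximum stored tokens (burst size)
        self.tokens = capacity    # start with a full bucket
        self.last = time.monotonic()

    def sleep_needed(self, consume: float = 1.0) -> float:
        now = time.monotonic()
        # Replenish tokens for the elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        raw = (consume - self.tokens) / self.rate
        self.tokens -= consume
        # A negative raw duration just means tokens were available,
        # so no sleep is required: clamp at zero.
        return max(0.0, raw)
```

With the params above, the first call computes (1 - 300) / 20 = -14.95 and clamps it to 0: the negative number means "no wait needed", not a misconfiguration.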
Sounds good, thanks for the quick updates and response. I look forward to your additional feedback. I'm not quite sure how a token-bucket algorithm equates to my goals, but ideally I would be able to say:
This function is only able to run a maximum of 20 times per second,
and also a maximum of 300 times per minute.
In theory I would want an additional maximum of X per hour, but my function probably won't be alive for that long.
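Note that a single token bucket with `rate = 20` and `capacity = 300` gives you "20/s steady state with a burst of 300", which is not the same as "at most 300 per minute". One common way to enforce both caps (a hedged sketch around my own `Bucket` helper, not this library's API) is to compose two buckets, one per limit, and sleep for the stricter of the two:

```python
import time

class Bucket:
    """Minimal token bucket for one rate limit (illustrative, not the library's code)."""

    def __init__(self, rate: float, capacity: float):
        self.rate, self.capacity = rate, capacity
        self.tokens, self.last = capacity, time.monotonic()

    def wait_time(self, consume: float = 1.0) -> float:
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        wait = max(0.0, (consume - self.tokens) / self.rate)
        self.tokens -= consume
        return wait

# "20 per second" is rate=20 with a small burst (capacity=20);
# "300 per minute" is rate=5 (300 / 60) with capacity=300.
per_second = Bucket(rate=20, capacity=20)
per_minute = Bucket(rate=300 / 60, capacity=300)

def acquire():
    # Sleep for whichever limit demands the longer wait.
    time.sleep(max(per_second.wait_time(), per_minute.wait_time()))
```

An "X per hour" cap would just be a third bucket with `rate = X / 3600` added to the same `max(...)`.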
In addition to my negative-number question from my initial post, I looked at your code changes and noticed that if `jitter == True`, you're defining the amount as `amount = random() / units` and then returning `duration - amount`. If I leave `units` at the default (1000, i.e. milliseconds), then `random.random()` generates a random float on the interval [0.0, 1.0), so the maximum random jitter would be 1/1000, or 0.001 seconds. I feel like that's too low to be useful, but I don't have a good sense of what "good jitter" looks like, so maybe I'm wrong.
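Assuming `amount = random() / units` as described, the jitter is bounded by `1 / units` seconds, which makes the scale easy to check (a quick illustration; `max_jitter_seconds` is my own hypothetical helper, not part of the library):

```python
import random

def max_jitter_seconds(units: float) -> float:
    # random.random() is in [0.0, 1.0), so random() / units is bounded by 1 / units.
    return 1.0 / units

assert max_jitter_seconds(1000) == 0.001  # default units: at most 1 ms of jitter
assert max_jitter_seconds(10) == 0.1      # smaller units: up to 100 ms

# Empirically confirm the bound on sampled jitter amounts:
samples = [random.random() / 1000 for _ in range(10_000)]
assert max(samples) <= max_jitter_seconds(1000)
```

So at the default `units`, jitter tops out around a millisecond, which matches the intuition above that it's small relative to a 50 ms per-token sleep at `rate = 20`.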