A semaphore implementation using promises. Forked from vercel/async-sema.
- Universal - Works in all modern browsers, Node.js, and Deno, and supports CLI usage.
- Zero Dependencies - Absolutely no dependencies, keeping the package tiny (24kb).
- Tested - Greater than 98% test coverage.
- Typed - Out of the box TypeScript declarations.
Browsers | Load sema4 directly from esm.sh: `<script type="module">import { Sema } from 'https://esm.sh/sema4';</script>`
---|---
Deno | Load sema4 directly from esm.sh: `import { Sema } from 'https://esm.sh/sema4?dts';`
Node 18+ | Install with your package manager (e.g. `npm install sema4`), then `import { Sema } from 'sema4';`
Name | Type | Optional | Default | Description
---|---|---|---|---
maxConcurrency | Integer | No | N/A | The maximum number of callers allowed to acquire the semaphore concurrently
options.initFn | Function | Yes | `() => '1'` | The function that is used to initialize the tokens used to manage the semaphore
options.pauseFn | Function | Yes* | N/A | The function that is called to opportunistically request pausing the incoming stream of data, instead of piling up waiting promises and possibly running out of memory
options.resumeFn | Function | Yes* | N/A | The function that is called when there is room again to accept new waiters on the semaphore. This function must be declared if a pauseFn is declared
options.capacity | Integer | Yes | 10 | Sets the size of the pre-allocated waiting list inside the semaphore. This is typically used by high-performance applications where the developer can make a rough estimate of the number of concurrent users of the semaphore.
Drains the semaphore and returns all the initialized tokens in an array. Draining is an ideal way to ensure there are no pending async tasks, for example before a process terminates.
Returns the number of callers waiting on the semaphore, i.e. the number of pending promises.
Attempts to acquire a token from the semaphore, if one is available immediately. Otherwise, returns `undefined`.
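To make the non-blocking contract concrete, here is a minimal plain-JavaScript counting-semaphore sketch. It is an illustration of the semantics, not sema4's implementation, and the `MiniSema` name is invented for the example.

```javascript
// Minimal counting-semaphore sketch (not the sema4 source) illustrating the
// tryAcquire() contract: hand out a token when a slot is free, or return
// undefined immediately when the semaphore is exhausted.
class MiniSema {
  constructor(maxConcurrency) {
    this.free = maxConcurrency; // remaining execution slots
  }
  tryAcquire() {
    if (this.free > 0) {
      this.free--;
      return '1'; // default token, mirroring initFn's () => '1'
    }
    return undefined; // no slot available right now — caller is not blocked
  }
  release() {
    this.free++;
  }
}

const s = new MiniSema(1);
console.log(s.tryAcquire()); // '1' — the single slot was free
console.log(s.tryAcquire()); // undefined — semaphore is exhausted
s.release();
console.log(s.tryAcquire()); // '1' — a slot opened up again
```

Unlike `acquire()`, which returns a promise that settles once a slot frees up, this path never waits, so it suits callers that would rather skip work than queue for it.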
Acquires a token from the semaphore, thus decrementing the number of available execution slots. If `initFn` is not used then the return value of the function can be discarded.
Releases the semaphore, thus incrementing the number of free execution slots. If `initFn` is used then the token returned by `acquire()` should be given as an argument when calling this function.
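To picture how `initFn`-created tokens flow through `acquire()`, `release()`, and `drain()`, here is a simplified, synchronous sketch of the token-pool semantics. This is an illustration only, not the sema4 source: it ignores the async waiting behavior and the `MiniSema`/`TokenPool` names are invented for the example.

```javascript
// Sketch of the initFn token-pool semantics (a simplified model, not the
// sema4 implementation): initFn pre-creates one token per execution slot,
// acquire() hands a token out, and release(token) returns that same token
// to the pool so the next caller can reuse it.
class TokenPool {
  constructor(maxConcurrency, initFn) {
    // One token per slot, created up front by initFn.
    this.tokens = Array.from({ length: maxConcurrency }, () => initFn());
  }
  acquire() {
    return this.tokens.pop(); // hand out a free token (undefined if none left)
  }
  release(token) {
    this.tokens.push(token); // put the caller's token back in the pool
  }
  drain() {
    // Like drain() above: collect every token, leaving none to acquire.
    const all = this.tokens;
    this.tokens = [];
    return all;
  }
}

let nextId = 0;
const pool = new TokenPool(2, () => ({ id: nextId++ })); // tokens can carry real state
const token = pool.acquire();
pool.release(token); // the same object goes back for reuse
console.log(pool.drain().length); // 2 — both tokens recovered before shutdown
```

Because tokens round-trip through `release()`, `initFn` can create real resources (connection handles, buffers) that are safely recycled between callers.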
Creates a rate limit instance.
Name | Type | Optional | Default | Description
---|---|---|---|---
rate | Integer | No | N/A | Number of tasks allowed per interval
options.interval | Integer | Yes | 1000 | Defines the width of the rate-limiting window in milliseconds
options.uniformDistribution | Boolean | Yes | `false` | Enforces a discrete uniform distribution over time. Setting the uniformDistribution option is mainly useful in a situation where the flow of rate limit function calls is continuous and occurring faster than the interval (e.g. reading a file) and not enabling it would cause the maximum number of calls to resolve immediately (thus exhausting the limit immediately), so the next bunch of calls would need to wait for interval milliseconds. However, if the flow is sparse then this option may make the code run slower with no advantage.
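One way to picture the difference is to compute when each call is allowed to start. The sketch below is illustrative arithmetic, not sema4's actual scheduler: the `scheduledStart` helper is invented for this example.

```javascript
// Illustrative sketch (not the sema4 scheduler): the millisecond offset at
// which call number n (0-based) may start under each mode.
function scheduledStart(n, rate, interval, uniformDistribution) {
  if (uniformDistribution) {
    // Calls are spread evenly: one slot every interval / rate ms.
    return n * (interval / rate);
  }
  // Default: each window of `rate` calls starts together at the beginning
  // of its interval, producing bursts.
  return Math.floor(n / rate) * interval;
}

// rate = 5, interval = 1000 ms
console.log(scheduledStart(4, 5, 1000, false)); // 0    — the 5th call is still in the first burst
console.log(scheduledStart(5, 5, 1000, false)); // 1000 — the 6th call waits a full interval
console.log(scheduledStart(1, 5, 1000, true));  // 200  — with uniformDistribution, even early calls are spaced out
```

Both modes admit the same number of calls per interval; `uniformDistribution` only trades the burst-then-wait pattern for a steady cadence.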
Acquires a semaphore and connects a timeout for its release. If the rate limit is reached, the execution process is halted until an available semaphore is released.
Releases all acquired semaphores immediately and resets the timeouts connected to them.
```js
import { Sema } from 'sema4';

function foo(array) {
  const s = new Sema(
    4, // Allow 4 concurrent async calls
    {
      capacity: 100, // Preallocated space for 100 tokens
    },
  );

  async function fetchData(x) {
    await s.acquire();
    try {
      console.log(s.waiting() + ' calls to fetch are waiting');
      // Perform some async tasks here...
    } finally {
      s.release();
    }
  }

  return Promise.all(array.map(fetchData));
}
```
```js
import { RateLimit } from 'sema4';

async function bar(n) {
  const rl = new RateLimit(5); // Limit to 5 tasks per default time interval

  for (let i = 0; i < n; i++) {
    await rl.apply();
    // Perform some async tasks here...
  }
}
```
In addition to the contributors of the parent repository vercel/async-sema, these lovely people have helped keep this library going.