Performance limits for continuous streaming of data using typed arrays (JS) #2715
atmgrifter00 started this conversation in General · 1 comment · 8 replies
Reply from the thread:

I'm coincidentally working on some benchmarks for this now. A few questions:

Original question:
We're currently investigating the possibility of using Perspective as the host for both static and live data sets. The live data would take a form akin to two typed arrays (an Int32 index buffer and a Float32 sample buffer) packaged into an Apache Arrow table; ultimately, Perspective is handed an ArrayBuffer.
My initial finding, across a variety of live-data sample sizes and update rates, is that Perspective can process roughly 640K-680K rows per second (each row has two values). This is using `table.replace(...)` to update the table on each update, and essentially all of that time is spent inside the WASM boundary (within Perspective). Does anyone know whether Perspective should be able to handle higher update rates, or am I hitting a hard boundary? Our goal is to handle 1M samples per second across 4 traces at roughly 30 fps.
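Roughly, each update looks like the following minimal sketch. It assumes the `apache-arrow` JS package (`tableFromArrays` / `tableToIPC`) and `@finos/perspective`; the batch size, column names, and the requestAnimationFrame-driven loop are illustrative stand-ins for our actual harness, not the exact benchmark code:

```js
import perspective from "@finos/perspective";
import { tableFromArrays, tableToIPC } from "apache-arrow";

// Illustrative batch size; the real value depends on sample rate and fps.
const BATCH = 32768;

// Package the two typed arrays into an Arrow IPC stream buffer.
function toArrowBuffer(index, samples) {
  const arrow = tableFromArrays({ index, sample: samples });
  const ipc = tableToIPC(arrow, "stream"); // Uint8Array of the IPC stream
  // Copy out the exact bytes so Perspective receives an ArrayBuffer.
  return ipc.buffer.slice(ipc.byteOffset, ipc.byteOffset + ipc.byteLength);
}

const worker = await perspective.worker();

// Seed the table from an initial batch of live data.
const index = new Int32Array(BATCH);
const samples = new Float32Array(BATCH);
const table = await worker.table(toArrowBuffer(index, samples));

// Replace the table's contents on every animation frame (~30-60 fps).
async function tick() {
  // ...fill `index` and `samples` with the latest live data here...
  await table.replace(toArrowBuffer(index, samples));
  requestAnimationFrame(tick);
}
requestAnimationFrame(tick);
```

Since `replace(...)` swaps the table's entire contents on each call, the rows-per-second figure above is just the batch size times the achieved update rate.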
Note: I'll put together a StackBlitz example in a bit to help demonstrate what I'm doing, and link it here.