Add advanced SQL analytics to streaming cypher rows before returning results to the client #3610
vga91 added a commit to vga91/neo4j-apoc-procedures that referenced this issue on Jan 24, 2025: …her rows before returning results to the client
Imagine there were an aggregation function,
apoc.agg.analytics("SQL statement", {row-map}, optional SQL parameters) yield map/row
By embedding, e.g., the DuckDB JDBC driver, we could create a temporary table/CTE over the input rows and run the provided SQL statement, including WINDOW functions, PIVOT, etc.
The JDBC driver would, of course, be an optional jar.
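A minimal sketch of what the DuckDB-backed path could look like, assuming the streamed rows are buffered into a list of maps first. The class name, method name, table name, and hard-coded schema below are illustrative, not existing APOC code; the only real dependency assumed is the DuckDB JDBC driver on the classpath:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.Statement;
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

/**
 * Illustrative sketch only -- not actual APOC code. Buffers the streamed
 * Cypher rows into an in-process DuckDB table, runs the caller-supplied SQL
 * over it, and returns the result rows as maps.
 */
public class SqlAnalyticsSketch {

    public static List<Map<String, Object>> analytics(String sql, List<Map<String, Object>> rows) throws Exception {
        // "jdbc:duckdb:" with no path opens an in-memory database inside the JVM.
        try (Connection conn = DriverManager.getConnection("jdbc:duckdb:")) {
            // For brevity the schema is hard-coded here; a real implementation
            // would infer column names and types from the incoming row maps.
            try (Statement ddl = conn.createStatement()) {
                ddl.execute("CREATE TABLE input_rows (name VARCHAR, value DOUBLE)");
            }
            try (PreparedStatement insert = conn.prepareStatement("INSERT INTO input_rows VALUES (?, ?)")) {
                for (Map<String, Object> row : rows) {
                    insert.setObject(1, row.get("name"));
                    insert.setObject(2, row.get("value"));
                    insert.executeUpdate();
                }
            }
            // Run the user-provided SQL (window functions, PIVOT, ...) and
            // convert each result row back into a map for the Cypher side.
            List<Map<String, Object>> out = new ArrayList<>();
            try (Statement st = conn.createStatement(); ResultSet rs = st.executeQuery(sql)) {
                ResultSetMetaData md = rs.getMetaData();
                while (rs.next()) {
                    Map<String, Object> result = new LinkedHashMap<>();
                    for (int i = 1; i <= md.getColumnCount(); i++) {
                        result.put(md.getColumnLabel(i), rs.getObject(i));
                    }
                    out.add(result);
                }
            }
            return out;
        }
    }
}
```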
We could even allow arbitrary JDBC URLs, so that DuckDB would just be one option that runs in-process, and complex analytics could be delegated to an external analytics database (see the sketch below).
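A hedged sketch of that variant: only the connection URL would change, and the same temp-table-plus-query logic from the sketch above would run against whatever backend it points at. The setting name "apoc.agg.analytics.jdbc.url" is a placeholder, not a real APOC config key:

```java
import java.sql.Connection;
import java.sql.DriverManager;

// Hypothetical configuration sketch: the JDBC URL alone decides whether the
// analytics SQL runs in-process (DuckDB) or is delegated to an external
// analytics database. The property name and fallback URL are placeholders.
public class AnalyticsBackend {
    public static Connection open() throws Exception {
        String jdbcUrl = System.getProperty("apoc.agg.analytics.jdbc.url", "jdbc:duckdb:");
        // The temp-table creation and SQL execution from the sketch above
        // would follow here, unchanged, regardless of the chosen backend.
        return DriverManager.getConnection(jdbcUrl);
    }
}
```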
Or somehow as a subquery / docs?