Over-optimisation on custom query #13501
Unanswered
mengyu-dev
asked this question in Q&A
Replies: 1 comment
-
#13334 implements pushdown of dynamic filters to Druid. I think it should do what you're expecting. Could you try it out and let us know if it works as expected?
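For anyone trying this, a rough way to check whether the dynamic filter actually reaches Druid is to enable the feature and look at the plan. This is only a sketch: the session property enable_dynamic_filtering does exist in Trino, but the exact EXPLAIN ANALYZE output varies between versions.

SET SESSION enable_dynamic_filtering = true;

-- With pushdown, the Druid scan should report a dynamic filter collected from
-- the Postgres side instead of scanning the whole bigdata table.
EXPLAIN ANALYZE
SELECT sum(A)
FROM druid.druid.bigdata
WHERE id IN (SELECT id FROM postgres.public.smalldata WHERE type = 'TYPE');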
-
We have one table in Druid which contains a large amount of data, and one table in Postgres which contains only 10k small rows. We want to avoid the join operation, so we write a custom SQL query like this:
SELECT sum(A) from druid.druid.bigdata where id in (select id from postgres.public.smalldata where type = 'TYPE')
On its own, select id from postgres.public.smalldata where type = 'TYPE' takes only 2 seconds.
If we inline that result into the query, SELECT sum(A) from druid.druid.bigdata where id in (x, y, z, ...) also takes only a few seconds.
As these two tables live in two separate databases, we can't run this query directly, so we use Trino. But Trino translates the query into an inner join (with dynamic filtering), which is too slow.
Is there any way to force Trino to execute the query as written and avoid this optimisation?
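For reference, the plan Trino picks here can be inspected with EXPLAIN; in a setup like the one above it shows the IN subquery rewritten into a join between the two connectors. A minimal sketch, reusing the table names from the query (the comment describes what we would expect to see, not guaranteed output):

EXPLAIN
SELECT sum(A)
FROM druid.druid.bigdata
WHERE id IN (SELECT id FROM postgres.public.smalldata WHERE type = 'TYPE');

-- Expected shape: a join between the Druid scan and the Postgres scan with a
-- dynamic filter on the Druid side; if that filter is only applied inside
-- Trino rather than pushed down to Druid, the full bigdata table is still read.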