Warming the cache #59
Comments
dataloader has a `prime` function for this purpose. So if we provided something similar, then instead of

```ruby
def warm(record)
  key = record.public_send(@primary_key)
  promise = cache[cache_key(key)] ||= Promise.new.tap { |promise| promise.source = self }
  promise.fulfill(record) unless promise.fulfilled?
end
```

you could have

```ruby
def warm(record)
  prime(record.id, record)
end
```

or just use […]. I'm not as sure how the graphql-ruby plugin should work, though. In your example it provides a […]

Are you only warming the cache using the context? If so, why not just use the context to get the record you want (e.g. the current user) instead of going through a batch loader?

Also, if we allow a proc to be provided to prime the cache, then what should happen after a mutation field is resolved? In that case we clear the cache. Should we be priming the cache after it is cleared? That would assume that anything in the context is updated according to the mutation.

dataloader doesn't have these problems, since its loaders are typically part of the context and it isn't integrated with graphql-js to automatically clear the cache after mutation fields are resolved. Instead, it looks like they expect mutations to invalidate cache keys, although that seems like an error-prone approach.
I suppose I may as well dump this here, in case it can be useful for someone. I attempted this when trying to optimize a query that loads 40k+ records at the first level; preloaders and promises were eating away at all my 🐏. A couple of notes on this:
```ruby
module Loaders
  # Eagerly loads an entire table (or a named scope of it) into an
  # in-memory map at construction time, so each subsequent load is a plain
  # hash lookup with no per-record Promise allocation.
  class PreloadLoader < GraphQL::Batch::Loader
    def initialize(model, key: :id, value: nil, joins: nil, scope: nil)
      query = scope.nil? ? model.all : model.public_send(scope)
      query = query.joins(joins) unless joins.nil?

      @map =
        if value.present?
          # Map key => a single attribute value.
          query.pluck(key, value).to_h
        else
          # Map key => the full record.
          query.index_by { |rec| rec.public_send(key) }
        end
      super()
    end

    # Overrides GraphQL::Batch::Loader#load to return the value directly
    # (synchronously) instead of a Promise, so #perform is never called.
    def load(key)
      @map[key]
    end
  end
end
```
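The `pluck(key, value)` versus `index_by` distinction above comes from ActiveRecord; a self-contained sketch of the same eager-map idea with plain Ruby objects (no Rails, names illustrative) shows what each branch produces:

```ruby
# Stand-in for an ActiveRecord row.
Record = Struct.new(:id, :name)

# Mirrors PreloadLoader's two modes: with `value:` it maps key => attribute
# (like pluck(key, value).to_h); without, key => record (like index_by).
class EagerMap
  def initialize(records, key: :id, value: nil)
    @map =
      if value
        records.to_h { |r| [r.public_send(key), r.public_send(value)] }
      else
        records.to_h { |r| [r.public_send(key), r] }
      end
  end

  # Synchronous lookup, no Promise allocation per record.
  def load(key)
    @map[key]
  end
end

records = [Record.new(1, "a"), Record.new(2, "b")]
puts EagerMap.new(records).load(2).name          # => b
puts EagerMap.new(records, value: :name).load(1) # => a
```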
We load some records to place into the GraphQL context before running a query. I'd like to warm the graphql-batch loader caches with these records. At the moment I'm doing something like this (with a contrived example): […]
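The author's snippet did not survive extraction; as a hedged, self-contained sketch of the flow being described (a record loaded into the context before execution is used to warm a loader cache), with all class and method names illustrative:

```ruby
# Illustrative loader, NOT graphql-batch's RecordLoader: warm seeds an
# internal cache so load never has to go to the database for that id.
class RecordLoader
  def initialize
    @cache = {}
  end

  def warm(record)
    @cache[record.id] ||= record
  end

  def load(id)
    @cache.fetch(id) { raise "would batch-load record #{id} here" }
  end
end

current_user = Struct.new(:id, :name).new(42, "alice")
context = { current_user: current_user, loader: RecordLoader.new }

# Before running the query, warm the cache with the already-loaded record.
context[:loader].warm(context[:current_user])
puts context[:loader].load(42).name # => alice
```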
This feels like it dips a little too much into this gem's responsibilities. Is there any interest in including some sort of cache-warming facility in this gem?
Update: I've simplified the example.