Serializing huge amounts of data into the `Store` can cause large spikes in memory usage. Currently, `_serialize_into_store` processes batches by model, but it loads all of a model's data into memory before bulk-writing it to the `Store` model.
Deliverables

- Relocate and split up `_serialize_into_store` into smaller functions
- Have the serialization process the model objects in chunks, or via a queryset `iterator()` (ref)
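The chunking deliverable could look something like the sketch below. It is not the actual `_serialize_into_store` implementation; the function and parameter names here are hypothetical, and in the real code the `records` argument would be a Django queryset consumed via `.iterator()` so that rows are streamed from the database rather than held in memory all at once.

```python
from itertools import islice


def chunked(iterable, chunk_size):
    """Yield successive lists of up to chunk_size items from any iterable,
    without materializing the whole iterable in memory."""
    it = iter(iterable)
    while True:
        chunk = list(islice(it, chunk_size))
        if not chunk:
            return
        yield chunk


def serialize_in_chunks(records, serialize, bulk_save, chunk_size=500):
    """Hypothetical helper: serialize records chunk by chunk, bulk-saving
    each batch before moving to the next, so peak memory is bounded by
    chunk_size rather than by the total number of records."""
    for chunk in chunked(records, chunk_size):
        bulk_save([serialize(record) for record in chunk])
```

In Django, `records` would typically be `SomeModel.objects.all().iterator(chunk_size=chunk_size)` and `bulk_save` a wrapper around `Store.objects.bulk_create(...)`, keeping both the read side and the write side bounded.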