We recently added a link to the JSON versions of entities, which has resulted in Google crawling them.
With this, the memory consumption and request timeout issues have recurred, because some entities have thousands of merged people and owned companies. That means a lot of data, and a lot of memory, to traverse the chains of ownership.
On the page versions, we resolved this by paginating owned companies and merged people (independently). We could implement something similar for the JSON, but we'd need to:
- Figure out how to specify the pagination in the response. We currently output a bare JSON list, so we would presumably have to wrap it in an object with some extra parameters (see the sketch after this list).
- Document the pagination. We're effectively becoming more of an API here, so we need to document how it works.
- Figure out how to actually paginate the data in the JSON. We run quite custom MongoDB queries for the page versions at the moment, but the equivalents of those queries are embedded in the graph traversal for the JSON. The same code is also used for the graph page and bulk export (and perhaps other things I can't remember).
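To make the response shape concrete, here is a minimal sketch of what a paginated JSON payload could look like, assuming a pymongo collection and hypothetical names (`owned_companies`, `owner_id`, `page`/`per_page` parameters). It's an illustration only; the real implementation would reuse the existing page-version queries rather than a standalone `find`.

```python
# Minimal sketch: wrap the current JSON list in a pagination envelope.
# Collection, database, and field names below are assumptions, not the
# project's actual schema.
from pymongo import MongoClient

client = MongoClient()
owned_companies = client["register"]["owned_companies"]  # assumed names

def owned_companies_page(entity_id, page=1, per_page=100):
    """Return one page of owned companies plus the metadata a client
    needs to request the remaining pages."""
    query = {"owner_id": entity_id}
    cursor = (owned_companies.find(query, {"_id": 0})
                             .skip((page - 1) * per_page)
                             .limit(per_page))
    return {
        "items": list(cursor),          # was previously the entire response
        "page": page,
        "per_page": per_page,
        "total": owned_companies.count_documents(query),
    }
```

Wrapping the list in an object like this is a breaking change for anyone already consuming the bare-list output, which is part of why the pagination behaviour would need documenting.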