Copying very large repo (30k issues, 33k PRs) #84

@velle

I want to keep a local copy of metadata for qgis/QGIS, which is quite large with 30k issues and 33k pull requests. Has anyone used github-to-sqlite for anything like that?

The GitHub API has a rate limit of 5,000 API calls per hour for authenticated users, and I think it takes one call for each issue - am I right? Also, github-to-sqlite does not offer a way to fetch only a subset of the issues, and it does not commit to SQLite until all issues have been fetched (so if a run is interrupted, e.g. by hitting the rate limit, the data fetched so far is simply discarded).
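For illustration, here is a rough sketch of the kind of incremental, page-by-page fetch-and-commit loop I have in mind - not something github-to-sqlite does today, just what I mean by "commit as you go". It uses requests and sqlite-utils directly; the GITHUB_TOKEN environment variable, the github.db file name, and the issues table name are placeholders I picked for the example:

```python
import os

import requests
import sqlite_utils

db = sqlite_utils.Database("github.db")
session = requests.Session()
session.headers["Authorization"] = f"token {os.environ['GITHUB_TOKEN']}"

# The paginated issues list endpoint; note it returns pull requests too
# (PR items carry a "pull_request" key), up to 100 items per call.
url = "https://api.github.com/repos/qgis/QGIS/issues"
params = {"state": "all", "per_page": 100, "page": 1}

while True:
    response = session.get(url, params=params)
    response.raise_for_status()
    page = response.json()
    if not page:
        break
    # upsert_all runs in its own transaction, so each page is safely
    # on disk as soon as this call returns - an interrupted run keeps
    # everything fetched so far.
    db["issues"].upsert_all(page, pk="id", alter=True)
    print(
        f"page {params['page']}: {len(page)} items, "
        f"{response.headers.get('X-RateLimit-Remaining')} calls remaining"
    )
    params["page"] += 1
```

If the list endpoint really does return 100 items per call, the whole repo would only be a few hundred calls, but I may be misunderstanding how github-to-sqlite drives the API.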

So I don't see any way I can actually use github-to-sqlite for this (unless I pay GitHub for higher rate limits). Can you confirm?

Sincerely, Thomas
