[Feature Request] A way to synchronize a local binary cache with a remote binary cache, e.g. nuget, on a successful remote read #37042
Replies: 2 comments
-
I seem to have been blind while reading the docs.
-
The issue has been closed since there seems to be a way to solve the problem in the case of a remote NuGet source. What about other kinds of sources? The need to update the local cache, and so avoid repeated, time-consuming remote cache accesses, is equally important when the remote cache is, say, a network filesystem, an S3 bucket, or a custom HTTP-based location. Are there any plans to address the question more broadly? Thanks.
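To make the question concrete, these are the kinds of non-NuGet remote sources meant here. The snippet below is only a sketch using vcpkg's documented binary-caching providers; the share, bucket, and URL are placeholders, not real endpoints:

```powershell
# Illustrative only; the share, bucket, and URL below are placeholders.
# A network filesystem share via the "files" provider:
$env:VCPKG_BINARY_SOURCES = "clear;files,\\fileserver\vcpkg-cache,readwrite"
# An S3 bucket via the experimental "x-aws" provider:
$env:VCPKG_BINARY_SOURCES = "clear;x-aws,s3://my-bucket/vcpkg-cache/,readwrite"
# A custom HTTP endpoint via the "http" provider (URL template with {name}/{version}/{sha}):
$env:VCPKG_BINARY_SOURCES = "clear;http,https://cache.example.com/{name}/{version}/{sha},readwrite"
```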
-
Is your feature request related to a problem? Please describe.
(The workflow and mechanics described here assume a Windows environment with Visual Studio 2022 and MSBuild.)
I am looking for an equivalent to how NuGet handles caching of packages fetched from a remote NuGet source.
When a C# project references a NuGet package, NuGet first looks for the requested package in the local cache in %USERPROFILE%\.nuget\packages. If the package with the requested version is found there, the local cached copy is used. If it is not found locally, NuGet searches the configured remote NuGet sources, downloads the package, copies it into the local cache, and then copies it to the configured target location where the C# project expects it. The next time this specific package version is requested, NuGet does not need to download it from a remote location because it is now cached locally.
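For reference, this side of NuGet's behaviour can be observed with the standard dotnet CLI cache commands (shown here only to illustrate where the local copies live):

```powershell
# Print the location of the NuGet global packages folder (by default %USERPROFILE%\.nuget\packages).
dotnet nuget locals global-packages --list
# Clearing it forces the next restore to fetch packages from the remote source again.
dotnet nuget locals global-packages --clear
```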
Currently, vcpkg supports binary caching with local and/or remote cache locations.
Suppose I configure vcpkg with both a local and a remote binary cache and grant read-write access to each. When vcpkg has to build a package because it was not found in either cache, it will store the built package in the local as well as the remote binary cache.
However, if the package exists in the remote binary cache but not in the local one, vcpkg will download the package from the remote cache and copy it directly to the target directory in the solution; it will not write this downloaded package to the configured local binary cache.
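For concreteness, this is the kind of configuration meant above: a local files cache plus a remote NuGet feed, both read-write (the cache path and the feed URL are placeholders):

```powershell
# Local binary cache ("files" provider) plus a remote NuGet feed, both read-write; placeholders only.
# Cache miss in both   -> vcpkg builds the port and uploads the result to both caches.
# Hit only in the feed -> the package is restored, but currently not written back to C:\vcpkg-cache.
$env:VCPKG_BINARY_SOURCES = "clear;files,C:\vcpkg-cache,readwrite;nuget,https://pkgs.example.com/nuget/v3/index.json,readwrite"
```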
In our use case we work on one repository with many branches, so most developers have multiple clones of this repository, each with a different active branch.
In this configuration, package restore with NuGet is straightforward: a given package version only needs to be downloaded from the remote once, and afterwards the local copy is reused by every clone that needs that package version.
With vcpkg, unfortunately, this leads to the packages being downloaded from the remote binary cache again every time they need to be restored for another clone.
Proposed solution
Could you provide a way to configure binary caching so that, after a package is not found in the local cache and is therefore downloaded from the remote binary cache, the downloaded package is also copied into the local cache? This would avoid the unnecessary repeated download of the same package.
Describe alternatives you've considered
Nothing we were able to find or think of seems viable. Manually or automatically copying the VcpkgInstalledDir from one repository clone to another via a script would only mitigate the described problem as long as the required package versions are identical. As soon as another branch needs a different version of a port, that version would have to be downloaded again, even if it had already been downloaded in the past.
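For completeness, the script-based alternative mentioned above amounts to nothing more than mirroring the installed tree between clones; a rough sketch with placeholder paths:

```powershell
# Mirror the vcpkg installed tree from one clone to another (paths are placeholders).
# This only helps while both branches require exactly the same port versions.
robocopy "C:\src\clone-a\vcpkg_installed" "C:\src\clone-b\vcpkg_installed" /MIR
```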
Additional context
In case there is already a way to achieve what I described above, this is our current vcpkg configuration, included in the Directory.Build.props in our solution directory: