Is your feature request related to a problem? Please describe.
You got your hands on a Flipper Zero, but you're stuck with limited bandwidth and/or a data cap on satellite internet, or you're "borrowing" your neighbor's Wi-Fi, and you want to set up your newly acquired FZ with Uber's repo and all of its submodules. Running git clone --recursive means downloading 6.7GB of data. To save on data usage, it would be cool if there were a compressed snapshot of the repo and its submodules in the releases that you could pull instead.
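For reference, this is the download the request is trying to avoid (the repo URL here is an assumption; check the actual repo page):

```bash
# Full recursive clone of the repo plus all submodules -- roughly 6.7 GB of download.
# The URL is assumed for illustration.
git clone --recursive https://github.com/UberGuidoZ/Flipper.git
```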
Describe the solution you'd like
Zstandard, when faced with a lot of small files, such as 90% of what is in Flipper and its submodules, does very favorably in compression ratio (near 50%) and decompression speed (up to 1.8 GB/s per core) when a dictionary is trained against the data set. I propose that the machine that's used to update the submodules for this repo also run a bi-weekly cron job to create a compressed snapshot of the repo and upload it as a release. I have an example in my Infosec-Cheatsheets repo releases where you can demo its functionality as the end user. I achieved a compressed size of 56% of the original, saving 5GB of data compared to pulling it all down with git.
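A minimal sketch of what that snapshot step could look like, assuming zstd and the GitHub CLI (gh) are installed; the paths, tag/asset names, and use of gh are all assumptions, not the actual Release.sh:

```bash
#!/usr/bin/env bash
# Hypothetical snapshot script -- a sketch, not the actual Release.sh.
set -euo pipefail

REPO_DIR="/path/to/Flipper"          # assumed working-copy path
TAG="snapshot-$(date +%Y-%m-%d)"
OUT="flipper-${TAG}.tar.zst"

cd "$REPO_DIR"
git pull --recurse-submodules        # refresh the repo
git submodule update --init --recursive

# Tar the working tree (stripping VCS metadata) and compress with zstd,
# multithreaded (-T0) at a high compression level (-19).
tar --exclude-vcs -cf - . | zstd -T0 -19 -o "/tmp/${OUT}"
sha256sum "/tmp/${OUT}" > "/tmp/${OUT}.sha256"

# Publish the archive and its checksum as a GitHub release
# (assumes gh is authenticated for this repo).
gh release create "$TAG" "/tmp/${OUT}" "/tmp/${OUT}.sha256" \
  --title "Compressed snapshot ${TAG}" --notes "Bi-weekly zstd snapshot"
```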
@UberGuidoZ I created a branch in my fork called compress. It contains a working release script that creates an automated release containing a stripped, zstd-compressed archive, plus scripts to download/extract/checksum everything for Windows and Linux. The release and download scripts work on the Linux side; I have yet to test the Windows side of things. If the Windows tests are successful, the only thing you would need to do to deploy all of it is download the files and change the working dir path in Release.sh and the cron job entry (details of how to make a cron job entry are in the comments of Release.sh).
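For the cron job entry, a bi-weekly schedule could be approximated like this; the script path and log location are assumptions, and the real instructions live in the comments of Release.sh:

```bash
# Run the release script at 03:00 on the 1st and 15th of each month
# (cron has no true "every two weeks", so this is an approximation):
0 3 1,15 * * /path/to/Release.sh >> /var/log/flipper-release.log 2>&1
```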
Your workflow looks like the attached example. If you add this script, named zst.sh, to your repo, then the end user can pull/extract it with the attached one-liner.
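The original one-liner isn't reproduced here, but as a rough idea of what it could look like (the release asset name and repo URL are assumptions):

```bash
# Download the latest snapshot release, then decompress and extract it in one pass
# (asset name and URL are assumed; requires curl, zstd, and tar):
mkdir -p Flipper && curl -L \
  https://github.com/UberGuidoZ/Flipper/releases/latest/download/flipper-snapshot.tar.zst \
  | zstd -d | tar -xf - -C Flipper
```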
Describe alternatives you've considered
Pray to the internet gods for more data/bandwidth lol.